00:00:00.000 Started by upstream project "autotest-nightly-lts" build number 2382
00:00:00.000 originally caused by:
00:00:00.000 Started by upstream project "nightly-trigger" build number 3643
00:00:00.000 originally caused by:
00:00:00.000 Started by timer
00:00:00.018 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.024 The recommended git tool is: git
00:00:00.024 using credential 00000000-0000-0000-0000-000000000002
00:00:00.025 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.036 Fetching changes from the remote Git repository
00:00:00.038 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.053 Using shallow fetch with depth 1
00:00:00.053 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.053 > git --version # timeout=10
00:00:00.068 > git --version # 'git version 2.39.2'
00:00:00.068 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.080 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.081 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.252 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.265 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.273 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:03.273 > git config core.sparsecheckout # timeout=10
00:00:03.285 > git read-tree -mu HEAD # timeout=10
00:00:03.299 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:03.315 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:03.315 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:03.437 [Pipeline] Start of Pipeline
00:00:03.449 [Pipeline] library
00:00:03.451 Loading library shm_lib@master
00:00:03.451 Library shm_lib@master is cached. Copying from home.
00:00:03.465 [Pipeline] node
00:00:03.482 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.483 [Pipeline] {
00:00:03.491 [Pipeline] catchError
00:00:03.492 [Pipeline] {
00:00:03.502 [Pipeline] wrap
00:00:03.510 [Pipeline] {
00:00:03.518 [Pipeline] stage
00:00:03.520 [Pipeline] { (Prologue)
00:00:03.717 [Pipeline] sh
00:00:03.998 + logger -p user.info -t JENKINS-CI
00:00:04.017 [Pipeline] echo
00:00:04.019 Node: WFP20
00:00:04.026 [Pipeline] sh
00:00:04.329 [Pipeline] setCustomBuildProperty
00:00:04.343 [Pipeline] echo
00:00:04.345 Cleanup processes
00:00:04.350 [Pipeline] sh
00:00:04.635 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.635 1179975 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.648 [Pipeline] sh
00:00:04.967 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:04.967 ++ grep -v 'sudo pgrep'
00:00:04.967 ++ awk '{print $1}'
00:00:04.967 + sudo kill -9
00:00:04.967 + true
00:00:04.981 [Pipeline] cleanWs
00:00:04.992 [WS-CLEANUP] Deleting project workspace...
00:00:04.992 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.000 [WS-CLEANUP] done
00:00:05.005 [Pipeline] setCustomBuildProperty
00:00:05.018 [Pipeline] sh
00:00:05.300 + sudo git config --global --replace-all safe.directory '*'
00:00:05.398 [Pipeline] httpRequest
00:00:05.759 [Pipeline] echo
00:00:05.760 Sorcerer 10.211.164.20 is alive
00:00:05.770 [Pipeline] retry
00:00:05.771 [Pipeline] {
00:00:05.787 [Pipeline] httpRequest
00:00:05.792 HttpMethod: GET
00:00:05.792 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.793 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:05.796 Response Code: HTTP/1.1 200 OK
00:00:05.796 Success: Status code 200 is in the accepted range: 200,404
00:00:05.796 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.482 [Pipeline] }
00:00:06.499 [Pipeline] // retry
00:00:06.506 [Pipeline] sh
00:00:06.786 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:06.803 [Pipeline] httpRequest
00:00:07.105 [Pipeline] echo
00:00:07.108 Sorcerer 10.211.164.20 is alive
00:00:07.117 [Pipeline] retry
00:00:07.118 [Pipeline] {
00:00:07.129 [Pipeline] httpRequest
00:00:07.134 HttpMethod: GET
00:00:07.134 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:07.134 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:07.147 Response Code: HTTP/1.1 200 OK
00:00:07.148 Success: Status code 200 is in the accepted range: 200,404
00:00:07.148 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:19.580 [Pipeline] }
00:01:19.601 [Pipeline] // retry
00:01:19.609 [Pipeline] sh
00:01:19.898 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:22.444 [Pipeline] sh
00:01:22.730 + git -C spdk log --oneline -n5
00:01:22.730 c13c99a5e test: Various fixes for Fedora40
00:01:22.730 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:01:22.730 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:01:22.730 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:01:22.730 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:01:22.742 [Pipeline] }
00:01:22.757 [Pipeline] // stage
00:01:22.766 [Pipeline] stage
00:01:22.769 [Pipeline] { (Prepare)
00:01:22.785 [Pipeline] writeFile
00:01:22.802 [Pipeline] sh
00:01:23.087 + logger -p user.info -t JENKINS-CI
00:01:23.099 [Pipeline] sh
00:01:23.383 + logger -p user.info -t JENKINS-CI
00:01:23.395 [Pipeline] sh
00:01:23.680 + cat autorun-spdk.conf
00:01:23.680 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:23.680 SPDK_TEST_FUZZER_SHORT=1
00:01:23.680 SPDK_TEST_FUZZER=1
00:01:23.680 SPDK_RUN_UBSAN=1
00:01:23.688 RUN_NIGHTLY=1
00:01:23.692 [Pipeline] readFile
00:01:23.718 [Pipeline] withEnv
00:01:23.720 [Pipeline] {
00:01:23.733 [Pipeline] sh
00:01:24.019 + set -ex
00:01:24.019 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:24.019 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:24.019 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:24.019 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:24.019 ++ SPDK_TEST_FUZZER=1
00:01:24.019 ++ SPDK_RUN_UBSAN=1
00:01:24.019 ++ RUN_NIGHTLY=1
00:01:24.019 + case $SPDK_TEST_NVMF_NICS in
00:01:24.019 + DRIVERS=
00:01:24.019 + [[ -n '' ]]
00:01:24.019 + exit 0
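The step traced above is the conf-driven gate that decides whether NIC drivers need to be loaded for the test run: it sources autorun-spdk.conf and exits 0 when SPDK_TEST_NVMF_NICS is unset, which is the case for this fuzzer job. A minimal standalone sketch of that gate, assuming the same conf layout; the mlx5/e810 mappings below are illustrative assumptions, not taken from this job's script:

    #!/usr/bin/env bash
    # Sketch of the NIC gate seen in the trace above (illustrative, not the job's script).
    set -ex
    conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
    [[ -f $conf ]] && source "$conf"   # pulls in SPDK_TEST_FUZZER=1, SPDK_RUN_UBSAN=1, ...
    case "$SPDK_TEST_NVMF_NICS" in     # unset for this short-fuzz job
        mlx5) DRIVERS=mlx5_ib ;;       # assumed mapping, for illustration only
        e810) DRIVERS=ice ;;           # assumed mapping, for illustration only
        *)    DRIVERS= ;;
    esac
    # Nothing to bind for a fuzzer-only run, so stop here, as the job does.
    [[ -n $DRIVERS ]] || exit 0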
00:01:24.029 [Pipeline] }
00:01:24.044 [Pipeline] // withEnv
00:01:24.050 [Pipeline] }
00:01:24.065 [Pipeline] // stage
00:01:24.075 [Pipeline] catchError
00:01:24.078 [Pipeline] {
00:01:24.093 [Pipeline] timeout
00:01:24.093 Timeout set to expire in 30 min
00:01:24.095 [Pipeline] {
00:01:24.111 [Pipeline] stage
00:01:24.112 [Pipeline] { (Tests)
00:01:24.127 [Pipeline] sh
00:01:24.412 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:24.412 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:24.412 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:24.412 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:24.412 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:24.412 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:24.412 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:24.412 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:24.412 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:24.412 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:24.412 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:24.412 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:24.412 + source /etc/os-release
00:01:24.412 ++ NAME='Fedora Linux'
00:01:24.412 ++ VERSION='39 (Cloud Edition)'
00:01:24.412 ++ ID=fedora
00:01:24.412 ++ VERSION_ID=39
00:01:24.412 ++ VERSION_CODENAME=
00:01:24.412 ++ PLATFORM_ID=platform:f39
00:01:24.412 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:24.412 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:24.412 ++ LOGO=fedora-logo-icon
00:01:24.412 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:24.412 ++ HOME_URL=https://fedoraproject.org/
00:01:24.412 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:24.412 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:24.412 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:24.412 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:24.412 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:24.412 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:24.412 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:24.412 ++ SUPPORT_END=2024-11-12
00:01:24.412 ++ VARIANT='Cloud Edition'
00:01:24.412 ++ VARIANT_ID=cloud
00:01:24.412 + uname -a
00:01:24.412 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:24.412 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:27.705 Hugepages
00:01:27.705 node hugesize free / total
00:01:27.705 node0 1048576kB 0 / 0
00:01:27.705 node0 2048kB 0 / 0
00:01:27.705 node1 1048576kB 0 / 0
00:01:27.705 node1 2048kB 0 / 0
00:01:27.705
00:01:27.705 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:27.705 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:27.705 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:27.705 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:27.705 + rm -f /tmp/spdk-ld-path
00:01:27.705 + source autorun-spdk.conf
00:01:27.705 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:27.705 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:27.705 ++ SPDK_TEST_FUZZER=1
00:01:27.705 ++ SPDK_RUN_UBSAN=1
00:01:27.705 ++ RUN_NIGHTLY=1
00:01:27.705 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:27.705 + [[ -n '' ]]
00:01:27.705 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:27.705 + for M in /var/spdk/build-*-manifest.txt
00:01:27.705 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:27.705 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:27.705 + for M in /var/spdk/build-*-manifest.txt
00:01:27.705 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:27.705 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:27.705 + for M in /var/spdk/build-*-manifest.txt
00:01:27.705 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:27.705 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:27.705 ++ uname
00:01:27.705 + [[ Linux == \L\i\n\u\x ]]
00:01:27.705 + sudo dmesg -T
00:01:27.705 + sudo dmesg --clear
00:01:27.705 + dmesg_pid=1180866
00:01:27.705 + [[ Fedora Linux == FreeBSD ]]
00:01:27.705 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:27.705 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:27.705 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:27.705 + [[ -x /usr/src/fio-static/fio ]]
00:01:27.705 + export FIO_BIN=/usr/src/fio-static/fio
00:01:27.705 + FIO_BIN=/usr/src/fio-static/fio
00:01:27.705 + sudo dmesg -Tw
00:01:27.705 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:27.705 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:27.705 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:27.705 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:27.705 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:27.705 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:27.705 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:27.705 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:27.705 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:27.705 Test configuration:
00:01:27.705 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:27.705 SPDK_TEST_FUZZER_SHORT=1
00:01:27.705 SPDK_TEST_FUZZER=1
00:01:27.705 SPDK_RUN_UBSAN=1
00:01:27.705 RUN_NIGHTLY=1
18:59:46 -- common/autotest_common.sh@1689 -- $ [[ n == y ]]
00:01:27.706 18:59:46 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
18:59:46 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
18:59:46 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
18:59:46 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
18:59:46 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
18:59:46 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
18:59:46 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
18:59:46 -- paths/export.sh@5 -- $ export PATH
18:59:46 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
18:59:46 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
18:59:46 -- common/autobuild_common.sh@440 -- $ date +%s
18:59:46 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731952786.XXXXXX
18:59:46 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731952786.q1PjyM
18:59:46 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
18:59:46 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']'
18:59:46 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
18:59:46 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
18:59:46 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
18:59:46 -- common/autobuild_common.sh@456 -- $ get_config_params
18:59:46 -- common/autotest_common.sh@397 -- $ xtrace_disable
18:59:46 -- common/autotest_common.sh@10 -- $ set +x
18:59:46 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
18:59:46 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
18:59:46 -- spdk/autobuild.sh@12 -- $ umask 022
18:59:46 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
18:59:46 -- spdk/autobuild.sh@16 -- $ date -u
00:01:27.706 Mon Nov 18 05:59:46 PM UTC 2024
00:01:27.706 18:59:46 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:27.706 LTS-67-gc13c99a5e
00:01:27.706 18:59:46 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
18:59:46 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
18:59:46 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
18:59:46 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
18:59:46 -- common/autotest_common.sh@1093 -- $ xtrace_disable
18:59:46 -- common/autotest_common.sh@10 -- $ set +x
00:01:27.706 ************************************
00:01:27.706 START TEST ubsan
00:01:27.706 ************************************
00:01:27.706 18:59:46 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan'
00:01:27.706 using ubsan
00:01:27.706
00:01:27.706 real 0m0.000s
00:01:27.706 user 0m0.000s
00:01:27.706 sys 0m0.000s
00:01:27.706 18:59:46 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:01:27.706 18:59:46 -- common/autotest_common.sh@10 -- $ set +x
00:01:27.706 ************************************
00:01:27.706 END TEST ubsan
00:01:27.706 ************************************
00:01:27.706 18:59:46 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
18:59:46 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
18:59:46 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
18:59:46 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
18:59:46 -- spdk/autobuild.sh@52 -- $ llvm_precompile
18:59:46 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile
18:59:46 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
18:59:46 -- common/autotest_common.sh@1093 -- $ xtrace_disable
18:59:46 -- common/autotest_common.sh@10 -- $ set +x
00:01:27.706 ************************************
00:01:27.706 START TEST autobuild_llvm_precompile
00:01:27.706 ************************************
18:59:46 -- common/autotest_common.sh@1114 -- $ _llvm_precompile
18:59:46 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:27.706 18:59:46 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:01:27.706 Target: x86_64-redhat-linux-gnu
00:01:27.706 Thread model: posix
00:01:27.706 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
18:59:46 -- common/autobuild_common.sh@33 -- $ clang_num=17
18:59:46 -- common/autobuild_common.sh@35 -- $ export CC=clang-17
18:59:46 -- common/autobuild_common.sh@35 -- $ CC=clang-17
18:59:46 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
18:59:46 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
18:59:46 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
18:59:46 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
18:59:46 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
18:59:46 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
18:59:46 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:27.966 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:27.966 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:28.227 Using 'verbs' RDMA provider
00:01:44.062 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:01:56.277 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:01:56.277 Creating mk/config.mk...done.
00:01:56.277 Creating mk/cc.flags.mk...done.
00:01:56.277 Type 'make' to build.
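The precompile step above derives the clang major version by regex-matching the output of `clang --version`, then globs for the matching libFuzzer runtime before re-running configure with --with-fuzzer. A rough standalone equivalent of that detection, as a sketch assuming bash with extglob enabled (the traced fuzzer_libs pattern relies on it):

    #!/usr/bin/env bash
    # Sketch of the clang/libFuzzer detection traced above; runtime paths vary per distro.
    shopt -s extglob
    if [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
        clang_num=${BASH_REMATCH[2]}          # 17 on this builder
    fi
    export CC=clang-$clang_num CXX=clang++-$clang_num
    # Same glob idea as the fuzzer_libs assignment above: the fuzzer runtime
    # archive built without its own main().
    fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib=${fuzzer_libs[0]}
    [[ -e $fuzzer_lib ]] && echo "would configure with --with-fuzzer=$fuzzer_lib"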
00:01:56.277
00:01:56.277 real 0m28.410s
00:01:56.277 user 0m12.401s
00:01:56.277 sys 0m15.372s
00:01:56.277 19:00:14 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:01:56.277 19:00:14 -- common/autotest_common.sh@10 -- $ set +x
00:01:56.277 ************************************
00:01:56.277 END TEST autobuild_llvm_precompile
00:01:56.277 ************************************
00:01:56.277 19:00:14 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:56.277 19:00:14 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:56.277 19:00:14 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:56.277 19:00:14 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:56.277 19:00:14 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:56.537 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:56.537 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:56.856 Using 'verbs' RDMA provider
00:02:09.643 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:21.860 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:21.860 Creating mk/config.mk...done.
00:02:21.860 Creating mk/cc.flags.mk...done.
00:02:21.860 Type 'make' to build.
00:02:21.860 19:00:38 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
00:02:21.860 19:00:38 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:02:21.860 19:00:38 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:02:21.860 19:00:38 -- common/autotest_common.sh@10 -- $ set +x
00:02:21.860 ************************************
00:02:21.860 START TEST make
00:02:21.860 ************************************
00:02:21.860 19:00:38 -- common/autotest_common.sh@1114 -- $ make -j112
00:02:21.860 make[1]: Nothing to be done for 'all'.
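run_test, visible above wrapping both the ubsan check and `make -j112`, is what produces the START TEST/END TEST banners and the real/user/sys timing lines in this log. A simplified sketch of such a wrapper; the real helper lives in SPDK's test/common/autotest_common.sh and does considerably more bookkeeping:

    # Sketch of a run_test-style wrapper, not SPDK's actual implementation.
    run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"                  # prints the real/user/sys lines seen above
        local rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
    }
    run_test make make -j112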
00:02:22.119 The Meson build system
00:02:22.119 Version: 1.5.0
00:02:22.119 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:22.119 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:22.119 Build type: native build
00:02:22.119 Project name: libvfio-user
00:02:22.119 Project version: 0.0.1
00:02:22.119 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:22.119 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:22.119 Host machine cpu family: x86_64
00:02:22.119 Host machine cpu: x86_64
00:02:22.119 Run-time dependency threads found: YES
00:02:22.119 Library dl found: YES
00:02:22.119 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:22.119 Run-time dependency json-c found: YES 0.17
00:02:22.119 Run-time dependency cmocka found: YES 1.1.7
00:02:22.119 Program pytest-3 found: NO
00:02:22.119 Program flake8 found: NO
00:02:22.119 Program misspell-fixer found: NO
00:02:22.119 Program restructuredtext-lint found: NO
00:02:22.119 Program valgrind found: YES (/usr/bin/valgrind)
00:02:22.119 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:22.119 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:22.119 Compiler for C supports arguments -Wwrite-strings: YES
00:02:22.119 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:22.119 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:22.119 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:22.119 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:22.119 Build targets in project: 8
00:02:22.119 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:02:22.119 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:02:22.119
00:02:22.119 libvfio-user 0.0.1
00:02:22.119
00:02:22.119 User defined options
00:02:22.119 buildtype : debug
00:02:22.119 default_library: static
00:02:22.119 libdir : /usr/local/lib
00:02:22.119
00:02:22.119 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:22.688 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:22.688 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:02:22.688 [2/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:02:22.688 [3/36] Compiling C object samples/null.p/null.c.o
00:02:22.688 [4/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:02:22.688 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:02:22.688 [6/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:02:22.688 [7/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:02:22.688 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:02:22.688 [9/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:02:22.688 [10/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:02:22.688 [11/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:02:22.688 [12/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:02:22.688 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:02:22.688 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:02:22.688 [15/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:02:22.688 [16/36] Compiling C object test/unit_tests.p/mocks.c.o
00:02:22.688 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:02:22.688 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:02:22.688 [19/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:02:22.688 [20/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:02:22.688 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:02:22.688 [22/36] Compiling C object samples/server.p/server.c.o
00:02:22.688 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:02:22.688 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:02:22.688 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:02:22.688 [26/36] Compiling C object samples/client.p/client.c.o
00:02:22.688 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:02:22.688 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:02:22.688 [29/36] Linking static target lib/libvfio-user.a
00:02:22.688 [30/36] Linking target samples/client
00:02:22.688 [31/36] Linking target samples/server
00:02:22.688 [32/36] Linking target samples/null
00:02:22.688 [33/36] Linking target samples/shadow_ioeventfd_server
00:02:22.688 [34/36] Linking target test/unit_tests
00:02:22.688 [35/36] Linking target samples/gpio-pci-idio-16
00:02:22.688 [36/36] Linking target samples/lspci
00:02:22.947 INFO: autodetecting backend as ninja
00:02:22.947 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:22.947 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:23.206 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:02:23.206 ninja: no work to do.
00:02:28.488 The Meson build system
00:02:28.488 Version: 1.5.0
00:02:28.488 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:02:28.488 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:02:28.488 Build type: native build
00:02:28.488 Program cat found: YES (/usr/bin/cat)
00:02:28.488 Project name: DPDK
00:02:28.488 Project version: 23.11.0
00:02:28.488 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:28.488 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:28.488 Host machine cpu family: x86_64
00:02:28.488 Host machine cpu: x86_64
00:02:28.488 Message: ## Building in Developer Mode ##
00:02:28.488 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:28.488 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:28.488 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:28.488 Program python3 found: YES (/usr/bin/python3)
00:02:28.488 Program cat found: YES (/usr/bin/cat)
00:02:28.488 Compiler for C supports arguments -march=native: YES
00:02:28.488 Checking for size of "void *" : 8
00:02:28.488 Checking for size of "void *" : 8 (cached)
00:02:28.488 Library m found: YES
00:02:28.488 Library numa found: YES
00:02:28.488 Has header "numaif.h" : YES
00:02:28.488 Library fdt found: NO
00:02:28.488 Library execinfo found: NO
00:02:28.488 Has header "execinfo.h" : YES
00:02:28.488 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:28.488 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:28.488 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:28.488 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:28.488 Run-time dependency openssl found: YES 3.1.1
00:02:28.488 Run-time dependency libpcap found: YES 1.10.4
00:02:28.488 Has header "pcap.h" with dependency libpcap: YES
00:02:28.488 Compiler for C supports arguments -Wcast-qual: YES
00:02:28.488 Compiler for C supports arguments -Wdeprecated: YES
00:02:28.488 Compiler for C supports arguments -Wformat: YES
00:02:28.488 Compiler for C supports arguments -Wformat-nonliteral: YES
00:02:28.488 Compiler for C supports arguments -Wformat-security: YES
00:02:28.488 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:28.488 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:28.488 Compiler for C supports arguments -Wnested-externs: YES
00:02:28.488 Compiler for C supports arguments -Wold-style-definition: YES
00:02:28.488 Compiler for C supports arguments -Wpointer-arith: YES
00:02:28.488 Compiler for C supports arguments -Wsign-compare: YES
00:02:28.488 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:28.488 Compiler for C supports arguments -Wundef: YES
00:02:28.488 Compiler for C supports arguments -Wwrite-strings: YES
00:02:28.488 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:28.488 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:02:28.488 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:28.488 Program objdump found: YES (/usr/bin/objdump)
00:02:28.488 Compiler for C supports arguments -mavx512f: YES
00:02:28.488 Checking if "AVX512 checking" compiles: YES
00:02:28.488 Fetching value of define "__SSE4_2__" : 1
00:02:28.488 Fetching value of define "__AES__" : 1
00:02:28.488 Fetching value of define "__AVX__" : 1
00:02:28.488 Fetching value of define "__AVX2__" : 1
00:02:28.488 Fetching value of define "__AVX512BW__" : 1
00:02:28.488 Fetching value of define "__AVX512CD__" : 1
00:02:28.488 Fetching value of define "__AVX512DQ__" : 1
00:02:28.488 Fetching value of define "__AVX512F__" : 1
00:02:28.488 Fetching value of define "__AVX512VL__" : 1
00:02:28.488 Fetching value of define "__PCLMUL__" : 1
00:02:28.488 Fetching value of define "__RDRND__" : 1
00:02:28.488 Fetching value of define "__RDSEED__" : 1
00:02:28.488 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:28.488 Fetching value of define "__znver1__" : (undefined)
00:02:28.488 Fetching value of define "__znver2__" : (undefined)
00:02:28.488 Fetching value of define "__znver3__" : (undefined)
00:02:28.488 Fetching value of define "__znver4__" : (undefined)
00:02:28.488 Compiler for C supports arguments -Wno-format-truncation: NO
00:02:28.488 Message: lib/log: Defining dependency "log"
00:02:28.488 Message: lib/kvargs: Defining dependency "kvargs"
00:02:28.488 Message: lib/telemetry: Defining dependency "telemetry"
00:02:28.488 Checking for function "getentropy" : NO
00:02:28.488 Message: lib/eal: Defining dependency "eal"
00:02:28.488 Message: lib/ring: Defining dependency "ring"
00:02:28.488 Message: lib/rcu: Defining dependency "rcu"
00:02:28.488 Message: lib/mempool: Defining dependency "mempool"
00:02:28.488 Message: lib/mbuf: Defining dependency "mbuf"
00:02:28.488 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:28.488 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:28.488 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:28.488 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:28.488 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:28.488 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:28.488 Compiler for C supports arguments -mpclmul: YES
00:02:28.488 Compiler for C supports arguments -maes: YES
00:02:28.488 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:28.488 Compiler for C supports arguments -mavx512bw: YES
00:02:28.488 Compiler for C supports arguments -mavx512dq: YES
00:02:28.488 Compiler for C supports arguments -mavx512vl: YES
00:02:28.488 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:28.488 Compiler for C supports arguments -mavx2: YES
00:02:28.488 Compiler for C supports arguments -mavx: YES
00:02:28.488 Message: lib/net: Defining dependency "net"
00:02:28.488 Message: lib/meter: Defining dependency "meter"
00:02:28.488 Message: lib/ethdev: Defining dependency "ethdev"
00:02:28.488 Message: lib/pci: Defining dependency "pci"
00:02:28.488 Message: lib/cmdline: Defining dependency "cmdline"
00:02:28.488 Message: lib/hash: Defining dependency "hash"
00:02:28.488 Message: lib/timer: Defining dependency "timer"
00:02:28.488 Message: lib/compressdev: Defining dependency "compressdev"
00:02:28.488 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:28.488 Message: lib/dmadev: Defining dependency "dmadev"
00:02:28.488 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:28.488 Message: lib/power: Defining dependency "power"
00:02:28.488 Message: lib/reorder: Defining dependency "reorder"
00:02:28.488 Message: lib/security: Defining dependency "security"
00:02:28.488 Has header "linux/userfaultfd.h" : YES
00:02:28.488 Has header "linux/vduse.h" : YES
00:02:28.488 Message: lib/vhost: Defining dependency "vhost"
00:02:28.488 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:02:28.488 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:28.488 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:28.488 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:28.488 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:28.488 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:28.488 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:28.488 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:28.488 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:28.488 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:28.488 Program doxygen found: YES (/usr/local/bin/doxygen)
00:02:28.488 Configuring doxy-api-html.conf using configuration
00:02:28.488 Configuring doxy-api-man.conf using configuration
00:02:28.488 Program mandb found: YES (/usr/bin/mandb)
00:02:28.488 Program sphinx-build found: NO
00:02:28.488 Configuring rte_build_config.h using configuration
00:02:28.488 Message:
00:02:28.488 =================
00:02:28.488 Applications Enabled
00:02:28.488 =================
00:02:28.488
00:02:28.488 apps:
00:02:28.488
00:02:28.488
00:02:28.488 Message:
00:02:28.488 =================
00:02:28.488 Libraries Enabled
00:02:28.488 =================
00:02:28.488
00:02:28.488 libs:
00:02:28.489 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:28.489 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:28.489 cryptodev, dmadev, power, reorder, security, vhost,
00:02:28.489
00:02:28.489 Message:
00:02:28.489 ===============
00:02:28.489 Drivers Enabled
00:02:28.489 ===============
00:02:28.489
00:02:28.489 common:
00:02:28.489
00:02:28.489 bus:
00:02:28.489 pci, vdev,
00:02:28.489 mempool:
00:02:28.489 ring,
00:02:28.489 dma:
00:02:28.489
00:02:28.489 net:
00:02:28.489
00:02:28.489 crypto:
00:02:28.489
00:02:28.489 compress:
00:02:28.489
00:02:28.489 vdpa:
00:02:28.489
00:02:28.489
00:02:28.489 Message:
00:02:28.489 =================
00:02:28.489 Content Skipped
00:02:28.489 =================
00:02:28.489
00:02:28.489 apps:
00:02:28.489 dumpcap: explicitly disabled via build config
00:02:28.489 graph: explicitly disabled via build config
00:02:28.489 pdump: explicitly disabled via build config
00:02:28.489 proc-info: explicitly disabled via build config
00:02:28.489 test-acl: explicitly disabled via build config
00:02:28.489 test-bbdev: explicitly disabled via build config
00:02:28.489 test-cmdline: explicitly disabled via build config
00:02:28.489 test-compress-perf: explicitly disabled via build config
00:02:28.489 test-crypto-perf: explicitly disabled via build config
00:02:28.489 test-dma-perf: explicitly disabled via build config
00:02:28.489 test-eventdev: explicitly disabled via build config
00:02:28.489 test-fib: explicitly disabled via build config
00:02:28.489 test-flow-perf: explicitly disabled via build config
00:02:28.489 test-gpudev: explicitly disabled via build config
00:02:28.489 test-mldev: explicitly disabled via build config
00:02:28.489 test-pipeline: explicitly disabled via build config
00:02:28.489 test-pmd: explicitly disabled via build config
00:02:28.489 test-regex: explicitly disabled via build config
00:02:28.489 test-sad: explicitly disabled via build config
00:02:28.489 test-security-perf: explicitly disabled via build config
00:02:28.489
00:02:28.489 libs:
00:02:28.489 metrics: explicitly disabled via build config
00:02:28.489 acl: explicitly disabled via build config
00:02:28.489 bbdev: explicitly disabled via build config
00:02:28.489 bitratestats: explicitly disabled via build config
00:02:28.489 bpf: explicitly disabled via build config
00:02:28.489 cfgfile: explicitly disabled via build config
00:02:28.489 distributor: explicitly disabled via build config
00:02:28.489 efd: explicitly disabled via build config
00:02:28.489 eventdev: explicitly disabled via build config
00:02:28.489 dispatcher: explicitly disabled via build config
00:02:28.489 gpudev: explicitly disabled via build config
00:02:28.489 gro: explicitly disabled via build config
00:02:28.489 gso: explicitly disabled via build config
00:02:28.489 ip_frag: explicitly disabled via build config
00:02:28.489 jobstats: explicitly disabled via build config
00:02:28.489 latencystats: explicitly disabled via build config
00:02:28.489 lpm: explicitly disabled via build config
00:02:28.489 member: explicitly disabled via build config
00:02:28.489 pcapng: explicitly disabled via build config
00:02:28.489 rawdev: explicitly disabled via build config
00:02:28.489 regexdev: explicitly disabled via build config
00:02:28.489 mldev: explicitly disabled via build config
00:02:28.489 rib: explicitly disabled via build config
00:02:28.489 sched: explicitly disabled via build config
00:02:28.489 stack: explicitly disabled via build config
00:02:28.489 ipsec: explicitly disabled via build config
00:02:28.489 pdcp: explicitly disabled via build config
00:02:28.489 fib: explicitly disabled via build config
00:02:28.489 port: explicitly disabled via build config
00:02:28.489 pdump: explicitly disabled via build config
00:02:28.489 table: explicitly disabled via build config
00:02:28.489 pipeline: explicitly disabled via build config
00:02:28.489 graph: explicitly disabled via build config
00:02:28.489 node: explicitly disabled via build config
00:02:28.489
00:02:28.489 drivers:
00:02:28.489 common/cpt: not in enabled drivers build config
00:02:28.489 common/dpaax: not in enabled drivers build config
00:02:28.489 common/iavf: not in enabled drivers build config
00:02:28.489 common/idpf: not in enabled drivers build config
00:02:28.489 common/mvep: not in enabled drivers build config
00:02:28.489 common/octeontx: not in enabled drivers build config
00:02:28.489 bus/auxiliary: not in enabled drivers build config
00:02:28.489 bus/cdx: not in enabled drivers build config
00:02:28.489 bus/dpaa: not in enabled drivers build config
00:02:28.489 bus/fslmc: not in enabled drivers build config
00:02:28.489 bus/ifpga: not in enabled drivers build config
00:02:28.489 bus/platform: not in enabled drivers build config
00:02:28.489 bus/vmbus: not in enabled drivers build config
00:02:28.489 common/cnxk: not in enabled drivers build config
00:02:28.489 common/mlx5: not in enabled drivers build config
00:02:28.489 common/nfp: not in enabled drivers build config
00:02:28.489 common/qat: not in enabled drivers build config
00:02:28.489 common/sfc_efx: not in enabled drivers build config
00:02:28.489 mempool/bucket: not in enabled drivers build config
00:02:28.489 mempool/cnxk: not in enabled drivers build config
00:02:28.489 mempool/dpaa: not in enabled drivers build config
00:02:28.489 mempool/dpaa2: not in enabled drivers build config
00:02:28.489 mempool/octeontx: not in enabled drivers build config
00:02:28.489 mempool/stack: not in enabled drivers build config
00:02:28.489 dma/cnxk: not in enabled drivers build config
00:02:28.489 dma/dpaa: not in enabled drivers build config
00:02:28.489 dma/dpaa2: not in enabled drivers build config
00:02:28.489 dma/hisilicon: not in enabled drivers build config
00:02:28.489 dma/idxd: not in enabled drivers build config
00:02:28.489 dma/ioat: not in enabled drivers build config
00:02:28.489 dma/skeleton: not in enabled drivers build config
00:02:28.489 net/af_packet: not in enabled drivers build config
00:02:28.489 net/af_xdp: not in enabled drivers build config
00:02:28.489 net/ark: not in enabled drivers build config
00:02:28.489 net/atlantic: not in enabled drivers build config
00:02:28.489 net/avp: not in enabled drivers build config
00:02:28.489 net/axgbe: not in enabled drivers build config
00:02:28.489 net/bnx2x: not in enabled drivers build config
00:02:28.489 net/bnxt: not in enabled drivers build config
00:02:28.489 net/bonding: not in enabled drivers build config
00:02:28.489 net/cnxk: not in enabled drivers build config
00:02:28.489 net/cpfl: not in enabled drivers build config
00:02:28.489 net/cxgbe: not in enabled drivers build config
00:02:28.489 net/dpaa: not in enabled drivers build config
00:02:28.489 net/dpaa2: not in enabled drivers build config
00:02:28.489 net/e1000: not in enabled drivers build config
00:02:28.489 net/ena: not in enabled drivers build config
00:02:28.489 net/enetc: not in enabled drivers build config
00:02:28.489 net/enetfec: not in enabled drivers build config
00:02:28.489 net/enic: not in enabled drivers build config
00:02:28.489 net/failsafe: not in enabled drivers build config
00:02:28.489 net/fm10k: not in enabled drivers build config
00:02:28.489 net/gve: not in enabled drivers build config
00:02:28.489 net/hinic: not in enabled drivers build config
00:02:28.489 net/hns3: not in enabled drivers build config
00:02:28.489 net/i40e: not in enabled drivers build config
00:02:28.489 net/iavf: not in enabled drivers build config
00:02:28.489 net/ice: not in enabled drivers build config
00:02:28.489 net/idpf: not in enabled drivers build config
00:02:28.489 net/igc: not in enabled drivers build config
00:02:28.489 net/ionic: not in enabled drivers build config
00:02:28.489 net/ipn3ke: not in enabled drivers build config
00:02:28.489 net/ixgbe: not in enabled drivers build config
00:02:28.489 net/mana: not in enabled drivers build config
00:02:28.489 net/memif: not in enabled drivers build config
00:02:28.489 net/mlx4: not in enabled drivers build config
00:02:28.489 net/mlx5: not in enabled drivers build config
00:02:28.489 net/mvneta: not in enabled drivers build config
00:02:28.489 net/mvpp2: not in enabled drivers build config
00:02:28.489 net/netvsc: not in enabled drivers build config
00:02:28.489 net/nfb: not in enabled drivers build config
00:02:28.489 net/nfp: not in enabled drivers build config
00:02:28.489 net/ngbe: not in enabled drivers build config
00:02:28.489 net/null: not in enabled drivers build config
00:02:28.489 net/octeontx: not in enabled drivers build config
00:02:28.489 net/octeon_ep: not in enabled drivers build config
00:02:28.489 net/pcap: not in enabled drivers build config
00:02:28.489 net/pfe: not in enabled drivers build config
00:02:28.489 net/qede: not in enabled drivers build config
00:02:28.489 net/ring: not in enabled drivers build config
00:02:28.489 net/sfc: not in enabled drivers build config
00:02:28.489 net/softnic: not in enabled drivers build config
00:02:28.489 net/tap: not in enabled drivers build config
00:02:28.489 net/thunderx: not in enabled drivers build config
00:02:28.489 net/txgbe: not in enabled drivers build config
00:02:28.489 net/vdev_netvsc: not in enabled drivers build config
00:02:28.489 net/vhost: not in enabled drivers build config
00:02:28.489 net/virtio: not in enabled drivers build config
00:02:28.489 net/vmxnet3: not in enabled drivers build config
00:02:28.489 raw/*: missing internal dependency, "rawdev"
00:02:28.489 crypto/armv8: not in enabled drivers build config
00:02:28.489 crypto/bcmfs: not in enabled drivers build config
00:02:28.489 crypto/caam_jr: not in enabled drivers build config
00:02:28.489 crypto/ccp: not in enabled drivers build config
00:02:28.489 crypto/cnxk: not in enabled drivers build config
00:02:28.489 crypto/dpaa_sec: not in enabled drivers build config
00:02:28.489 crypto/dpaa2_sec: not in enabled drivers build config
00:02:28.489 crypto/ipsec_mb: not in enabled drivers build config
00:02:28.489 crypto/mlx5: not in enabled drivers build config
00:02:28.489 crypto/mvsam: not in enabled drivers build config
00:02:28.489 crypto/nitrox: not in enabled drivers build config
00:02:28.489 crypto/null: not in enabled drivers build config
00:02:28.489 crypto/octeontx: not in enabled drivers build config
00:02:28.489 crypto/openssl: not in enabled drivers build config
00:02:28.489 crypto/scheduler: not in enabled drivers build config
00:02:28.489 crypto/uadk: not in enabled drivers build config
00:02:28.489 crypto/virtio: not in enabled drivers build config
00:02:28.489 compress/isal: not in enabled drivers build config
00:02:28.489 compress/mlx5: not in enabled drivers build config
00:02:28.489 compress/octeontx: not in enabled drivers build config
00:02:28.489 compress/zlib: not in enabled drivers build config
00:02:28.489 regex/*: missing internal dependency, "regexdev"
00:02:28.489 ml/*: missing internal dependency, "mldev"
00:02:28.489 vdpa/ifc: not in enabled drivers build config
00:02:28.489 vdpa/mlx5: not in enabled drivers build config
00:02:28.489 vdpa/nfp: not in enabled drivers build config
00:02:28.489 vdpa/sfc: not in enabled drivers build config
00:02:28.489 event/*: missing internal dependency, "eventdev"
00:02:28.489 baseband/*: missing internal dependency, "bbdev"
00:02:28.489 gpu/*: missing internal dependency, "gpudev"
00:02:28.489
00:02:28.489
00:02:28.490 Build targets in project: 85
00:02:28.490
00:02:28.490 DPDK 23.11.0
00:02:28.490
00:02:28.490 User defined options
00:02:28.490 buildtype : debug
00:02:28.490 default_library : static
00:02:28.490 libdir : lib
00:02:28.490 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:02:28.490 c_args : -fPIC -Werror
00:02:28.490 c_link_args :
00:02:28.490 cpu_instruction_set: native
00:02:28.490 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev
00:02:28.490 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev
00:02:28.490 enable_docs : false
00:02:28.490 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:02:28.490 enable_kmods : false
00:02:28.490 tests : false
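The "User defined options" block above records what the configure step handed to meson for the bundled DPDK. Rebuilt as a command line from that summary, as an approximation for reference; the literal invocation is assembled by SPDK's dpdkbuild machinery, not typed by hand:

    # Approximate meson setup call behind the DPDK configure summary above.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build \
        --buildtype=debug --default-library=static --libdir=lib \
        -Dc_args='-fPIC -Werror' -Dcpu_instruction_set=native \
        -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring
    # The job additionally passes -Ddisable_apps=... and -Ddisable_libs=...
    # with the full comma-separated lists printed in the summary above.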
00:02:28.490
00:02:28.490 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:28.490 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:02:28.490 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:28.751 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:28.751 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:02:28.751 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:28.751 [5/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:28.751 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:28.751 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:28.751 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:28.751 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:28.751 [10/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:28.751 [11/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:02:28.751 [12/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:02:28.751 [13/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:28.751 [14/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:28.751 [15/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:28.751 [16/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:02:28.751 [17/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:28.751 [18/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:02:28.751 [19/265] Linking static target lib/librte_kvargs.a
00:02:28.751 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:02:28.751 [21/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:02:28.751 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:02:28.751 [23/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:02:28.751 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:02:28.751 [25/265] Linking static target lib/librte_log.a
00:02:28.751 [26/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:02:28.751 [27/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o
00:02:28.751 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:02:28.751 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:02:28.751 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
00:02:28.751 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o
00:02:28.751 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:02:28.751 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:02:28.751 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o
00:02:28.751 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o
00:02:28.751 [36/265] Linking static target lib/librte_pci.a
00:02:28.751 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o
00:02:28.751 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o
00:02:28.751 [39/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o
00:02:28.751 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o
00:02:28.751 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o
00:02:28.751 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:28.751 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.011 [44/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:02:29.011 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:29.011 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:29.011 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:02:29.011 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:02:29.011 [49/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:29.011 [50/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:02:29.011 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:29.011 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:29.011 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:29.011 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:02:29.011 [55/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:02:29.011 [56/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:02:29.011 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:29.011 [58/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:02:29.011 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:02:29.011 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:29.011 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:02:29.011 [62/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:29.011 [63/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:02:29.011 [64/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:02:29.011 [65/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:02:29.011 [66/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:02:29.011 [67/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:02:29.011 [68/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:02:29.011 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:02:29.011 [70/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:29.011 [71/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:29.011 [72/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:29.011 [73/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o
00:02:29.011 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:02:29.011 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:29.011 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:02:29.011 [77/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o
00:02:29.011 [78/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:02:29.011 [79/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:02:29.011 [80/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:02:29.011 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:02:29.011 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:02:29.011 [83/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:02:29.011 [84/265] Linking static target lib/librte_telemetry.a
00:02:29.011 [85/265] Linking static target lib/librte_meter.a
00:02:29.011 [86/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:02:29.011 [87/265] Linking static target lib/net/libnet_crc_avx512_lib.a
00:02:29.011 [88/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o
00:02:29.011 [89/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:29.011 [90/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:29.011 [91/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o
00:02:29.011 [92/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:29.011 [93/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:02:29.011 [94/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o
00:02:29.011 [95/265] Linking static target lib/librte_ring.a
00:02:29.011 [96/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o
00:02:29.011 [97/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:29.011 [98/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o
00:02:29.011 [99/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:02:29.011 [100/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:02:29.011 [101/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o
00:02:29.011 [102/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o
00:02:29.011 [103/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o
00:02:29.011 [104/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:29.011 [105/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:02:29.011 [106/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:29.011 [107/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:29.011 [108/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:02:29.011 [109/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:29.011 [110/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:29.011 [111/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o
00:02:29.011 [112/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:02:29.012 [113/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o
00:02:29.012 [114/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:02:29.012 [115/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o
00:02:29.012 [116/265] Linking static target lib/librte_timer.a
00:02:29.012 [117/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o
00:02:29.012 [118/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:29.012 [119/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:29.012 [120/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:29.012 [121/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:29.012 [122/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o
00:02:29.012 [123/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:29.271 [124/265] Linking static target lib/librte_cmdline.a
00:02:29.271 [125/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:02:29.271 [126/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:29.271 [127/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:29.271 [128/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o
00:02:29.271 [129/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o
00:02:29.271 [130/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:29.271 [131/265] Linking static target lib/librte_net.a
00:02:29.271 [132/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:29.271 [133/265] Linking static target lib/librte_mempool.a
00:02:29.271 [134/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:29.271 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:29.271 [136/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o
00:02:29.271 [137/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:29.271 [138/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:29.271 [139/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:29.271 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:29.271 [141/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:29.271 [142/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:29.271 [143/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:29.271 [144/265] Linking static target lib/librte_eal.a
00:02:29.271 [145/265] Linking target lib/librte_log.so.24.0
00:02:29.271 [146/265] Linking static target lib/librte_dmadev.a
00:02:29.271 [147/265] Linking static target lib/librte_compressdev.a
00:02:29.271 [148/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o
00:02:29.271 [149/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:29.271 [150/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:29.271 [151/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o
00:02:29.271 [152/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:29.271 [153/265] Linking static target lib/librte_rcu.a
00:02:29.271 [154/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:29.271 [155/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:29.271 [156/265] Linking static target lib/librte_mbuf.a
00:02:29.271 [157/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:02:29.271 [158/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:02:29.271 [159/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:02:29.271 [160/265] Linking static target lib/librte_power.a
00:02:29.271 [161/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:02:29.271 [162/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:29.271 [163/265] Linking static target lib/librte_reorder.a 00:02:29.271 [164/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:29.271 [165/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:29.271 [166/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:29.271 [167/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:29.271 [168/265] Linking static target lib/librte_hash.a 00:02:29.271 [169/265] Linking static target lib/librte_security.a 00:02:29.271 [170/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.271 [171/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:29.271 [172/265] Linking target lib/librte_kvargs.so.24.0 00:02:29.531 [173/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:29.531 [174/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:29.531 [175/265] Linking static target lib/librte_cryptodev.a 00:02:29.531 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:29.531 [177/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:29.531 [178/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:29.531 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:29.531 [180/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.531 [181/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:29.531 [182/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:29.531 [183/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:29.531 [184/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:29.531 [185/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:29.531 [186/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:29.531 [187/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:29.531 [188/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:29.531 [189/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:29.531 [190/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.531 [191/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:29.531 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:29.531 [193/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:29.531 [194/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.531 [195/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.531 [196/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:29.789 [197/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:29.789 [198/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:29.789 [199/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.789 [200/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:29.789 [201/265] 
Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:29.789 [202/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:29.789 [203/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.789 [204/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:29.789 [205/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:29.789 [206/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:29.789 [207/265] Linking static target drivers/librte_bus_vdev.a 00:02:29.789 [208/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.789 [209/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.789 [210/265] Linking static target drivers/librte_bus_pci.a 00:02:29.789 [211/265] Linking static target drivers/librte_mempool_ring.a 00:02:29.789 [212/265] Linking static target lib/librte_ethdev.a 00:02:29.789 [213/265] Linking target lib/librte_telemetry.so.24.0 00:02:29.789 [214/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.789 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:30.048 [216/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.048 [217/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.048 [218/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.048 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.048 [220/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.307 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.307 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.307 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:30.566 [224/265] Linking static target lib/librte_vhost.a 00:02:30.566 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.566 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.504 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.882 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:39.447 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.981 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.981 [231/265] Linking target lib/librte_eal.so.24.0 00:02:42.239 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:42.239 [233/265] Linking target lib/librte_pci.so.24.0 00:02:42.239 [234/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:42.239 [235/265] Linking target lib/librte_timer.so.24.0 00:02:42.239 [236/265] Linking target lib/librte_ring.so.24.0 00:02:42.239 [237/265] Linking target lib/librte_meter.so.24.0 00:02:42.239 [238/265] Linking target 
lib/librte_dmadev.so.24.0 00:02:42.498 [239/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:42.498 [240/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:42.498 [241/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:42.498 [242/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:42.498 [243/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:42.498 [244/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:42.498 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:42.498 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:42.498 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:42.498 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:42.757 [249/265] Linking target lib/librte_mbuf.so.24.0 00:02:42.757 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:42.757 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:42.757 [252/265] Linking target lib/librte_reorder.so.24.0 00:02:42.757 [253/265] Linking target lib/librte_compressdev.so.24.0 00:02:42.757 [254/265] Linking target lib/librte_net.so.24.0 00:02:42.757 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:43.017 [256/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:43.017 [257/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:43.017 [258/265] Linking target lib/librte_hash.so.24.0 00:02:43.017 [259/265] Linking target lib/librte_cmdline.so.24.0 00:02:43.017 [260/265] Linking target lib/librte_security.so.24.0 00:02:43.017 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:43.276 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:43.276 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:43.276 [264/265] Linking target lib/librte_power.so.24.0 00:02:43.276 [265/265] Linking target lib/librte_vhost.so.24.0 00:02:43.276 INFO: autodetecting backend as ninja 00:02:43.276 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:44.217 CC lib/ut_mock/mock.o 00:02:44.217 CC lib/log/log_deprecated.o 00:02:44.217 CC lib/ut/ut.o 00:02:44.217 CC lib/log/log.o 00:02:44.217 CC lib/log/log_flags.o 00:02:44.479 LIB libspdk_ut_mock.a 00:02:44.479 LIB libspdk_ut.a 00:02:44.479 LIB libspdk_log.a 00:02:44.739 CC lib/ioat/ioat.o 00:02:44.739 CC lib/util/base64.o 00:02:44.739 CC lib/util/bit_array.o 00:02:44.739 CC lib/util/cpuset.o 00:02:44.739 CC lib/util/crc16.o 00:02:44.739 CC lib/util/crc32c.o 00:02:44.739 CC lib/util/crc32.o 00:02:44.739 CC lib/util/crc64.o 00:02:44.739 CC lib/util/crc32_ieee.o 00:02:44.739 CC lib/util/dif.o 00:02:44.739 CC lib/util/hexlify.o 00:02:44.739 CC lib/util/fd.o 00:02:44.739 CC lib/util/file.o 00:02:44.739 CC lib/util/iov.o 00:02:44.739 CC lib/dma/dma.o 00:02:44.739 CC lib/util/math.o 00:02:44.739 CC lib/util/pipe.o 00:02:44.739 CC lib/util/strerror_tls.o 00:02:44.739 CC lib/util/string.o 00:02:44.739 CC lib/util/uuid.o 00:02:44.739 CC lib/util/fd_group.o 00:02:44.739 CXX lib/trace_parser/trace.o 00:02:44.739 CC lib/util/xor.o 00:02:44.739 CC lib/util/zipf.o 00:02:44.739 CC lib/vfio_user/host/vfio_user_pci.o 
00:02:44.739 CC lib/vfio_user/host/vfio_user.o 00:02:44.739 LIB libspdk_dma.a 00:02:44.998 LIB libspdk_ioat.a 00:02:44.998 LIB libspdk_vfio_user.a 00:02:44.998 LIB libspdk_util.a 00:02:45.256 LIB libspdk_trace_parser.a 00:02:45.256 CC lib/json/json_parse.o 00:02:45.256 CC lib/json/json_util.o 00:02:45.256 CC lib/vmd/led.o 00:02:45.256 CC lib/vmd/vmd.o 00:02:45.256 CC lib/json/json_write.o 00:02:45.256 CC lib/env_dpdk/env.o 00:02:45.256 CC lib/env_dpdk/memory.o 00:02:45.256 CC lib/env_dpdk/init.o 00:02:45.256 CC lib/env_dpdk/pci.o 00:02:45.256 CC lib/env_dpdk/threads.o 00:02:45.256 CC lib/env_dpdk/pci_ioat.o 00:02:45.256 CC lib/env_dpdk/pci_virtio.o 00:02:45.256 CC lib/env_dpdk/pci_vmd.o 00:02:45.256 CC lib/env_dpdk/pci_idxd.o 00:02:45.256 CC lib/env_dpdk/pci_event.o 00:02:45.256 CC lib/env_dpdk/sigbus_handler.o 00:02:45.256 CC lib/env_dpdk/pci_dpdk.o 00:02:45.256 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:45.256 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:45.256 CC lib/idxd/idxd_user.o 00:02:45.256 CC lib/idxd/idxd.o 00:02:45.256 CC lib/idxd/idxd_kernel.o 00:02:45.256 CC lib/rdma/common.o 00:02:45.256 CC lib/rdma/rdma_verbs.o 00:02:45.256 CC lib/conf/conf.o 00:02:45.514 LIB libspdk_conf.a 00:02:45.514 LIB libspdk_json.a 00:02:45.514 LIB libspdk_rdma.a 00:02:45.772 LIB libspdk_idxd.a 00:02:45.772 LIB libspdk_vmd.a 00:02:45.772 CC lib/jsonrpc/jsonrpc_server.o 00:02:45.772 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:45.772 CC lib/jsonrpc/jsonrpc_client.o 00:02:45.772 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:46.030 LIB libspdk_jsonrpc.a 00:02:46.289 CC lib/rpc/rpc.o 00:02:46.290 LIB libspdk_env_dpdk.a 00:02:46.290 LIB libspdk_rpc.a 00:02:46.549 CC lib/notify/notify.o 00:02:46.549 CC lib/notify/notify_rpc.o 00:02:46.549 CC lib/sock/sock.o 00:02:46.549 CC lib/sock/sock_rpc.o 00:02:46.549 CC lib/trace/trace.o 00:02:46.549 CC lib/trace/trace_flags.o 00:02:46.549 CC lib/trace/trace_rpc.o 00:02:46.808 LIB libspdk_notify.a 00:02:46.808 LIB libspdk_trace.a 00:02:46.808 LIB libspdk_sock.a 00:02:47.067 CC lib/thread/thread.o 00:02:47.067 CC lib/thread/iobuf.o 00:02:47.067 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:47.067 CC lib/nvme/nvme_ctrlr.o 00:02:47.068 CC lib/nvme/nvme_fabric.o 00:02:47.068 CC lib/nvme/nvme_ns_cmd.o 00:02:47.068 CC lib/nvme/nvme_ns.o 00:02:47.068 CC lib/nvme/nvme_pcie_common.o 00:02:47.068 CC lib/nvme/nvme_pcie.o 00:02:47.068 CC lib/nvme/nvme_qpair.o 00:02:47.068 CC lib/nvme/nvme_discovery.o 00:02:47.068 CC lib/nvme/nvme.o 00:02:47.068 CC lib/nvme/nvme_quirks.o 00:02:47.068 CC lib/nvme/nvme_transport.o 00:02:47.068 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:47.068 CC lib/nvme/nvme_tcp.o 00:02:47.068 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:47.068 CC lib/nvme/nvme_opal.o 00:02:47.068 CC lib/nvme/nvme_zns.o 00:02:47.068 CC lib/nvme/nvme_io_msg.o 00:02:47.068 CC lib/nvme/nvme_poll_group.o 00:02:47.068 CC lib/nvme/nvme_vfio_user.o 00:02:47.068 CC lib/nvme/nvme_cuse.o 00:02:47.068 CC lib/nvme/nvme_rdma.o 00:02:48.046 LIB libspdk_thread.a 00:02:48.046 CC lib/virtio/virtio.o 00:02:48.046 CC lib/virtio/virtio_vhost_user.o 00:02:48.046 CC lib/virtio/virtio_pci.o 00:02:48.046 CC lib/virtio/virtio_vfio_user.o 00:02:48.046 CC lib/init/json_config.o 00:02:48.046 CC lib/vfu_tgt/tgt_endpoint.o 00:02:48.046 CC lib/accel/accel.o 00:02:48.046 CC lib/vfu_tgt/tgt_rpc.o 00:02:48.046 CC lib/blob/blobstore.o 00:02:48.046 CC lib/accel/accel_rpc.o 00:02:48.046 CC lib/blob/request.o 00:02:48.046 CC lib/init/subsystem.o 00:02:48.046 CC lib/accel/accel_sw.o 00:02:48.046 CC lib/blob/zeroes.o 00:02:48.046 CC lib/init/rpc.o 
00:02:48.046 CC lib/init/subsystem_rpc.o 00:02:48.046 CC lib/blob/blob_bs_dev.o 00:02:48.305 LIB libspdk_init.a 00:02:48.305 LIB libspdk_virtio.a 00:02:48.305 LIB libspdk_vfu_tgt.a 00:02:48.305 LIB libspdk_nvme.a 00:02:48.564 CC lib/event/log_rpc.o 00:02:48.564 CC lib/event/app.o 00:02:48.564 CC lib/event/reactor.o 00:02:48.564 CC lib/event/scheduler_static.o 00:02:48.564 CC lib/event/app_rpc.o 00:02:48.823 LIB libspdk_accel.a 00:02:48.823 LIB libspdk_event.a 00:02:49.082 CC lib/bdev/bdev.o 00:02:49.082 CC lib/bdev/part.o 00:02:49.082 CC lib/bdev/bdev_rpc.o 00:02:49.082 CC lib/bdev/bdev_zone.o 00:02:49.082 CC lib/bdev/scsi_nvme.o 00:02:49.650 LIB libspdk_blob.a 00:02:49.908 CC lib/lvol/lvol.o 00:02:49.908 CC lib/blobfs/blobfs.o 00:02:49.909 CC lib/blobfs/tree.o 00:02:50.168 LIB libspdk_lvol.a 00:02:50.427 LIB libspdk_blobfs.a 00:02:50.685 LIB libspdk_bdev.a 00:02:50.943 CC lib/nbd/nbd.o 00:02:50.943 CC lib/nbd/nbd_rpc.o 00:02:50.943 CC lib/ftl/ftl_core.o 00:02:50.943 CC lib/ftl/ftl_init.o 00:02:50.943 CC lib/ftl/ftl_layout.o 00:02:50.943 CC lib/ftl/ftl_debug.o 00:02:50.943 CC lib/ftl/ftl_sb.o 00:02:50.943 CC lib/ublk/ublk.o 00:02:50.943 CC lib/ftl/ftl_io.o 00:02:50.943 CC lib/ublk/ublk_rpc.o 00:02:50.943 CC lib/ftl/ftl_l2p.o 00:02:50.943 CC lib/ftl/ftl_l2p_flat.o 00:02:50.943 CC lib/ftl/ftl_nv_cache.o 00:02:50.943 CC lib/ftl/ftl_band.o 00:02:50.943 CC lib/ftl/ftl_band_ops.o 00:02:50.943 CC lib/ftl/ftl_writer.o 00:02:50.943 CC lib/nvmf/ctrlr_bdev.o 00:02:50.943 CC lib/nvmf/ctrlr.o 00:02:50.943 CC lib/ftl/ftl_rq.o 00:02:50.943 CC lib/nvmf/ctrlr_discovery.o 00:02:50.943 CC lib/ftl/ftl_reloc.o 00:02:50.944 CC lib/ftl/ftl_l2p_cache.o 00:02:50.944 CC lib/nvmf/subsystem.o 00:02:50.944 CC lib/ftl/ftl_p2l.o 00:02:50.944 CC lib/nvmf/nvmf.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt.o 00:02:50.944 CC lib/scsi/dev.o 00:02:50.944 CC lib/nvmf/nvmf_rpc.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:50.944 CC lib/scsi/lun.o 00:02:50.944 CC lib/nvmf/transport.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:50.944 CC lib/scsi/port.o 00:02:50.944 CC lib/nvmf/tcp.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:50.944 CC lib/scsi/scsi.o 00:02:50.944 CC lib/nvmf/vfio_user.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:50.944 CC lib/nvmf/rdma.o 00:02:50.944 CC lib/scsi/scsi_bdev.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:50.944 CC lib/scsi/scsi_pr.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:50.944 CC lib/scsi/scsi_rpc.o 00:02:50.944 CC lib/scsi/task.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:50.944 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:50.944 CC lib/ftl/utils/ftl_conf.o 00:02:50.944 CC lib/ftl/utils/ftl_md.o 00:02:50.944 CC lib/ftl/utils/ftl_mempool.o 00:02:50.944 CC lib/ftl/utils/ftl_property.o 00:02:50.944 CC lib/ftl/utils/ftl_bitmap.o 00:02:50.944 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:50.944 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:50.944 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:50.944 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:50.944 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:50.944 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:50.944 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:50.944 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:50.944 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:50.944 CC lib/ftl/base/ftl_base_dev.o 00:02:50.944 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:50.944 CC 
lib/ftl/ftl_trace.o 00:02:50.944 CC lib/ftl/base/ftl_base_bdev.o 00:02:51.203 LIB libspdk_nbd.a 00:02:51.462 LIB libspdk_scsi.a 00:02:51.462 LIB libspdk_ublk.a 00:02:51.721 LIB libspdk_ftl.a 00:02:51.721 CC lib/vhost/vhost.o 00:02:51.721 CC lib/vhost/rte_vhost_user.o 00:02:51.721 CC lib/vhost/vhost_rpc.o 00:02:51.721 CC lib/vhost/vhost_scsi.o 00:02:51.721 CC lib/vhost/vhost_blk.o 00:02:51.721 CC lib/iscsi/init_grp.o 00:02:51.721 CC lib/iscsi/conn.o 00:02:51.721 CC lib/iscsi/iscsi.o 00:02:51.721 CC lib/iscsi/portal_grp.o 00:02:51.721 CC lib/iscsi/md5.o 00:02:51.721 CC lib/iscsi/param.o 00:02:51.721 CC lib/iscsi/tgt_node.o 00:02:51.721 CC lib/iscsi/iscsi_subsystem.o 00:02:51.721 CC lib/iscsi/iscsi_rpc.o 00:02:51.721 CC lib/iscsi/task.o 00:02:52.290 LIB libspdk_nvmf.a 00:02:52.290 LIB libspdk_vhost.a 00:02:52.549 LIB libspdk_iscsi.a 00:02:52.807 CC module/env_dpdk/env_dpdk_rpc.o 00:02:52.807 CC module/vfu_device/vfu_virtio.o 00:02:52.807 CC module/vfu_device/vfu_virtio_scsi.o 00:02:52.807 CC module/vfu_device/vfu_virtio_blk.o 00:02:52.807 CC module/vfu_device/vfu_virtio_rpc.o 00:02:53.066 LIB libspdk_env_dpdk_rpc.a 00:02:53.066 CC module/sock/posix/posix.o 00:02:53.066 CC module/accel/error/accel_error_rpc.o 00:02:53.066 CC module/accel/error/accel_error.o 00:02:53.066 CC module/blob/bdev/blob_bdev.o 00:02:53.066 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:53.066 CC module/accel/dsa/accel_dsa.o 00:02:53.066 CC module/accel/ioat/accel_ioat.o 00:02:53.066 CC module/accel/dsa/accel_dsa_rpc.o 00:02:53.066 CC module/accel/ioat/accel_ioat_rpc.o 00:02:53.066 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:53.066 CC module/accel/iaa/accel_iaa.o 00:02:53.066 CC module/accel/iaa/accel_iaa_rpc.o 00:02:53.066 CC module/scheduler/gscheduler/gscheduler.o 00:02:53.066 LIB libspdk_scheduler_dpdk_governor.a 00:02:53.066 LIB libspdk_accel_error.a 00:02:53.066 LIB libspdk_scheduler_dynamic.a 00:02:53.066 LIB libspdk_scheduler_gscheduler.a 00:02:53.066 LIB libspdk_accel_ioat.a 00:02:53.066 LIB libspdk_accel_iaa.a 00:02:53.066 LIB libspdk_blob_bdev.a 00:02:53.325 LIB libspdk_accel_dsa.a 00:02:53.325 LIB libspdk_vfu_device.a 00:02:53.325 LIB libspdk_sock_posix.a 00:02:53.584 CC module/bdev/lvol/vbdev_lvol.o 00:02:53.584 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:53.584 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:53.584 CC module/blobfs/bdev/blobfs_bdev.o 00:02:53.584 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:53.584 CC module/bdev/passthru/vbdev_passthru.o 00:02:53.584 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:53.584 CC module/bdev/nvme/bdev_nvme.o 00:02:53.584 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:53.584 CC module/bdev/iscsi/bdev_iscsi.o 00:02:53.584 CC module/bdev/nvme/nvme_rpc.o 00:02:53.584 CC module/bdev/error/vbdev_error.o 00:02:53.584 CC module/bdev/error/vbdev_error_rpc.o 00:02:53.584 CC module/bdev/nvme/bdev_mdns_client.o 00:02:53.584 CC module/bdev/nvme/vbdev_opal.o 00:02:53.584 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:53.584 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:53.584 CC module/bdev/null/bdev_null.o 00:02:53.584 CC module/bdev/null/bdev_null_rpc.o 00:02:53.584 CC module/bdev/malloc/bdev_malloc.o 00:02:53.584 CC module/bdev/aio/bdev_aio_rpc.o 00:02:53.584 CC module/bdev/aio/bdev_aio.o 00:02:53.584 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:53.584 CC module/bdev/raid/bdev_raid.o 00:02:53.584 CC module/bdev/raid/bdev_raid_sb.o 00:02:53.584 CC module/bdev/raid/bdev_raid_rpc.o 00:02:53.584 CC module/bdev/raid/raid0.o 00:02:53.584 CC module/bdev/raid/concat.o 
00:02:53.584 CC module/bdev/raid/raid1.o 00:02:53.584 CC module/bdev/delay/vbdev_delay.o 00:02:53.584 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:53.584 CC module/bdev/ftl/bdev_ftl.o 00:02:53.584 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:53.584 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:53.584 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:53.584 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:53.584 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:53.584 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:53.584 CC module/bdev/gpt/gpt.o 00:02:53.584 CC module/bdev/gpt/vbdev_gpt.o 00:02:53.584 CC module/bdev/split/vbdev_split.o 00:02:53.584 CC module/bdev/split/vbdev_split_rpc.o 00:02:53.843 LIB libspdk_blobfs_bdev.a 00:02:53.843 LIB libspdk_bdev_split.a 00:02:53.843 LIB libspdk_bdev_error.a 00:02:53.843 LIB libspdk_bdev_null.a 00:02:53.843 LIB libspdk_bdev_ftl.a 00:02:53.843 LIB libspdk_bdev_passthru.a 00:02:53.843 LIB libspdk_bdev_aio.a 00:02:53.843 LIB libspdk_bdev_iscsi.a 00:02:53.843 LIB libspdk_bdev_gpt.a 00:02:53.843 LIB libspdk_bdev_zone_block.a 00:02:53.843 LIB libspdk_bdev_delay.a 00:02:53.843 LIB libspdk_bdev_malloc.a 00:02:53.843 LIB libspdk_bdev_lvol.a 00:02:53.843 LIB libspdk_bdev_virtio.a 00:02:54.103 LIB libspdk_bdev_raid.a 00:02:54.673 LIB libspdk_bdev_nvme.a 00:02:55.241 CC module/event/subsystems/vmd/vmd.o 00:02:55.241 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:55.241 CC module/event/subsystems/iobuf/iobuf.o 00:02:55.241 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:55.241 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:55.241 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:55.241 CC module/event/subsystems/sock/sock.o 00:02:55.241 CC module/event/subsystems/scheduler/scheduler.o 00:02:55.500 LIB libspdk_event_vmd.a 00:02:55.500 LIB libspdk_event_vfu_tgt.a 00:02:55.500 LIB libspdk_event_vhost_blk.a 00:02:55.500 LIB libspdk_event_sock.a 00:02:55.500 LIB libspdk_event_iobuf.a 00:02:55.500 LIB libspdk_event_scheduler.a 00:02:55.759 CC module/event/subsystems/accel/accel.o 00:02:55.759 LIB libspdk_event_accel.a 00:02:56.325 CC module/event/subsystems/bdev/bdev.o 00:02:56.325 LIB libspdk_event_bdev.a 00:02:56.584 CC module/event/subsystems/scsi/scsi.o 00:02:56.584 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:56.584 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:56.584 CC module/event/subsystems/ublk/ublk.o 00:02:56.584 CC module/event/subsystems/nbd/nbd.o 00:02:56.584 LIB libspdk_event_scsi.a 00:02:56.584 LIB libspdk_event_ublk.a 00:02:56.584 LIB libspdk_event_nbd.a 00:02:56.843 LIB libspdk_event_nvmf.a 00:02:57.101 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:57.101 CC module/event/subsystems/iscsi/iscsi.o 00:02:57.101 LIB libspdk_event_vhost_scsi.a 00:02:57.101 LIB libspdk_event_iscsi.a 00:02:57.361 CC app/spdk_lspci/spdk_lspci.o 00:02:57.361 CC app/trace_record/trace_record.o 00:02:57.361 CXX app/trace/trace.o 00:02:57.361 CC test/rpc_client/rpc_client_test.o 00:02:57.361 CC app/spdk_nvme_identify/identify.o 00:02:57.361 CC app/spdk_nvme_perf/perf.o 00:02:57.361 TEST_HEADER include/spdk/accel.h 00:02:57.361 CC app/spdk_top/spdk_top.o 00:02:57.361 TEST_HEADER include/spdk/accel_module.h 00:02:57.361 TEST_HEADER include/spdk/assert.h 00:02:57.361 TEST_HEADER include/spdk/barrier.h 00:02:57.361 CC app/spdk_nvme_discover/discovery_aer.o 00:02:57.361 TEST_HEADER include/spdk/bdev_module.h 00:02:57.361 TEST_HEADER include/spdk/base64.h 00:02:57.361 TEST_HEADER include/spdk/bdev.h 00:02:57.361 TEST_HEADER include/spdk/bdev_zone.h 
00:02:57.361 TEST_HEADER include/spdk/bit_array.h 00:02:57.361 TEST_HEADER include/spdk/bit_pool.h 00:02:57.361 TEST_HEADER include/spdk/blob_bdev.h 00:02:57.361 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:57.361 TEST_HEADER include/spdk/blobfs.h 00:02:57.361 TEST_HEADER include/spdk/blob.h 00:02:57.361 TEST_HEADER include/spdk/conf.h 00:02:57.361 TEST_HEADER include/spdk/config.h 00:02:57.361 TEST_HEADER include/spdk/cpuset.h 00:02:57.361 TEST_HEADER include/spdk/crc16.h 00:02:57.361 TEST_HEADER include/spdk/crc64.h 00:02:57.361 TEST_HEADER include/spdk/crc32.h 00:02:57.361 TEST_HEADER include/spdk/dif.h 00:02:57.361 TEST_HEADER include/spdk/dma.h 00:02:57.361 TEST_HEADER include/spdk/endian.h 00:02:57.361 TEST_HEADER include/spdk/env_dpdk.h 00:02:57.361 TEST_HEADER include/spdk/env.h 00:02:57.361 TEST_HEADER include/spdk/fd_group.h 00:02:57.361 TEST_HEADER include/spdk/event.h 00:02:57.361 TEST_HEADER include/spdk/fd.h 00:02:57.361 TEST_HEADER include/spdk/file.h 00:02:57.361 TEST_HEADER include/spdk/gpt_spec.h 00:02:57.361 TEST_HEADER include/spdk/ftl.h 00:02:57.361 TEST_HEADER include/spdk/hexlify.h 00:02:57.361 TEST_HEADER include/spdk/histogram_data.h 00:02:57.361 TEST_HEADER include/spdk/idxd_spec.h 00:02:57.361 TEST_HEADER include/spdk/init.h 00:02:57.361 TEST_HEADER include/spdk/idxd.h 00:02:57.361 TEST_HEADER include/spdk/ioat_spec.h 00:02:57.361 TEST_HEADER include/spdk/ioat.h 00:02:57.361 TEST_HEADER include/spdk/iscsi_spec.h 00:02:57.361 TEST_HEADER include/spdk/json.h 00:02:57.361 TEST_HEADER include/spdk/jsonrpc.h 00:02:57.626 TEST_HEADER include/spdk/likely.h 00:02:57.626 TEST_HEADER include/spdk/lvol.h 00:02:57.626 TEST_HEADER include/spdk/log.h 00:02:57.626 TEST_HEADER include/spdk/memory.h 00:02:57.626 TEST_HEADER include/spdk/mmio.h 00:02:57.626 TEST_HEADER include/spdk/nbd.h 00:02:57.626 TEST_HEADER include/spdk/notify.h 00:02:57.626 TEST_HEADER include/spdk/nvme_intel.h 00:02:57.626 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:57.626 TEST_HEADER include/spdk/nvme.h 00:02:57.626 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:57.626 TEST_HEADER include/spdk/nvme_spec.h 00:02:57.626 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:57.626 TEST_HEADER include/spdk/nvme_zns.h 00:02:57.626 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:57.626 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:57.626 TEST_HEADER include/spdk/nvmf.h 00:02:57.626 TEST_HEADER include/spdk/nvmf_transport.h 00:02:57.626 TEST_HEADER include/spdk/nvmf_spec.h 00:02:57.626 TEST_HEADER include/spdk/opal.h 00:02:57.626 CC app/spdk_dd/spdk_dd.o 00:02:57.626 TEST_HEADER include/spdk/opal_spec.h 00:02:57.626 TEST_HEADER include/spdk/pci_ids.h 00:02:57.626 TEST_HEADER include/spdk/pipe.h 00:02:57.626 CC app/iscsi_tgt/iscsi_tgt.o 00:02:57.626 TEST_HEADER include/spdk/reduce.h 00:02:57.626 TEST_HEADER include/spdk/queue.h 00:02:57.626 TEST_HEADER include/spdk/rpc.h 00:02:57.626 TEST_HEADER include/spdk/scsi.h 00:02:57.626 TEST_HEADER include/spdk/scheduler.h 00:02:57.626 TEST_HEADER include/spdk/scsi_spec.h 00:02:57.626 CC app/nvmf_tgt/nvmf_main.o 00:02:57.626 TEST_HEADER include/spdk/sock.h 00:02:57.626 TEST_HEADER include/spdk/string.h 00:02:57.626 TEST_HEADER include/spdk/stdinc.h 00:02:57.626 TEST_HEADER include/spdk/thread.h 00:02:57.626 TEST_HEADER include/spdk/trace_parser.h 00:02:57.626 TEST_HEADER include/spdk/trace.h 00:02:57.626 TEST_HEADER include/spdk/tree.h 00:02:57.626 TEST_HEADER include/spdk/ublk.h 00:02:57.626 CC app/spdk_tgt/spdk_tgt.o 00:02:57.626 TEST_HEADER include/spdk/util.h 00:02:57.626 
TEST_HEADER include/spdk/uuid.h 00:02:57.626 TEST_HEADER include/spdk/version.h 00:02:57.626 CC app/vhost/vhost.o 00:02:57.626 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:57.626 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:57.626 TEST_HEADER include/spdk/vhost.h 00:02:57.626 TEST_HEADER include/spdk/vmd.h 00:02:57.626 CXX test/cpp_headers/accel.o 00:02:57.626 TEST_HEADER include/spdk/zipf.h 00:02:57.626 TEST_HEADER include/spdk/xor.h 00:02:57.626 CXX test/cpp_headers/accel_module.o 00:02:57.626 CXX test/cpp_headers/assert.o 00:02:57.626 CXX test/cpp_headers/barrier.o 00:02:57.626 CXX test/cpp_headers/bdev.o 00:02:57.626 CXX test/cpp_headers/base64.o 00:02:57.626 CXX test/cpp_headers/bdev_module.o 00:02:57.626 CXX test/cpp_headers/bdev_zone.o 00:02:57.626 CXX test/cpp_headers/bit_array.o 00:02:57.626 CXX test/cpp_headers/bit_pool.o 00:02:57.626 CXX test/cpp_headers/blob_bdev.o 00:02:57.626 CXX test/cpp_headers/blobfs_bdev.o 00:02:57.626 CXX test/cpp_headers/blobfs.o 00:02:57.626 CXX test/cpp_headers/blob.o 00:02:57.626 CXX test/cpp_headers/conf.o 00:02:57.626 CXX test/cpp_headers/config.o 00:02:57.626 CXX test/cpp_headers/cpuset.o 00:02:57.626 CXX test/cpp_headers/crc16.o 00:02:57.626 CXX test/cpp_headers/crc32.o 00:02:57.626 CC test/nvme/reset/reset.o 00:02:57.626 CC test/nvme/aer/aer.o 00:02:57.626 CC test/nvme/reserve/reserve.o 00:02:57.626 CXX test/cpp_headers/crc64.o 00:02:57.626 CXX test/cpp_headers/dma.o 00:02:57.626 CXX test/cpp_headers/dif.o 00:02:57.626 CXX test/cpp_headers/env_dpdk.o 00:02:57.626 CXX test/cpp_headers/endian.o 00:02:57.626 CXX test/cpp_headers/env.o 00:02:57.626 CC test/nvme/cuse/cuse.o 00:02:57.626 CXX test/cpp_headers/fd_group.o 00:02:57.626 CXX test/cpp_headers/event.o 00:02:57.626 CC test/nvme/overhead/overhead.o 00:02:57.626 CC test/nvme/e2edp/nvme_dp.o 00:02:57.626 CXX test/cpp_headers/gpt_spec.o 00:02:57.626 CXX test/cpp_headers/fd.o 00:02:57.626 CXX test/cpp_headers/file.o 00:02:57.626 CXX test/cpp_headers/ftl.o 00:02:57.626 CXX test/cpp_headers/hexlify.o 00:02:57.626 CC test/nvme/err_injection/err_injection.o 00:02:57.626 CXX test/cpp_headers/idxd.o 00:02:57.626 CXX test/cpp_headers/histogram_data.o 00:02:57.626 CC test/app/jsoncat/jsoncat.o 00:02:57.626 CXX test/cpp_headers/idxd_spec.o 00:02:57.626 CC test/nvme/startup/startup.o 00:02:57.626 CC test/env/vtophys/vtophys.o 00:02:57.626 CC test/env/memory/memory_ut.o 00:02:57.626 CC test/nvme/fused_ordering/fused_ordering.o 00:02:57.626 CC test/nvme/simple_copy/simple_copy.o 00:02:57.626 CC examples/nvme/reconnect/reconnect.o 00:02:57.626 CC test/nvme/boot_partition/boot_partition.o 00:02:57.626 CC test/app/histogram_perf/histogram_perf.o 00:02:57.626 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:57.626 CC examples/nvme/abort/abort.o 00:02:57.626 CC test/env/pci/pci_ut.o 00:02:57.626 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:57.626 CC test/nvme/sgl/sgl.o 00:02:57.626 CC examples/sock/hello_world/hello_sock.o 00:02:57.626 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:57.626 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:57.626 CC test/nvme/fdp/fdp.o 00:02:57.626 CC test/event/event_perf/event_perf.o 00:02:57.626 CC test/thread/poller_perf/poller_perf.o 00:02:57.626 CC test/thread/lock/spdk_lock.o 00:02:57.626 CC examples/vmd/lsvmd/lsvmd.o 00:02:57.626 CC examples/nvme/hello_world/hello_world.o 00:02:57.626 CC examples/idxd/perf/perf.o 00:02:57.626 CC examples/nvme/arbitration/arbitration.o 00:02:57.626 CC test/nvme/compliance/nvme_compliance.o 00:02:57.626 CC 
test/accel/dif/dif.o 00:02:57.626 CC examples/ioat/perf/perf.o 00:02:57.626 CC test/nvme/connect_stress/connect_stress.o 00:02:57.626 CC examples/vmd/led/led.o 00:02:57.626 CC examples/util/zipf/zipf.o 00:02:57.626 CC test/app/stub/stub.o 00:02:57.626 CC app/fio/nvme/fio_plugin.o 00:02:57.626 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:57.626 CC examples/nvme/hotplug/hotplug.o 00:02:57.626 CC test/dma/test_dma/test_dma.o 00:02:57.626 CC test/event/reactor/reactor.o 00:02:57.626 CC test/event/reactor_perf/reactor_perf.o 00:02:57.626 CC examples/accel/perf/accel_perf.o 00:02:57.626 CC examples/ioat/verify/verify.o 00:02:57.626 CC test/event/app_repeat/app_repeat.o 00:02:57.626 CXX test/cpp_headers/init.o 00:02:57.626 CC examples/bdev/hello_world/hello_bdev.o 00:02:57.626 LINK spdk_lspci 00:02:57.626 CC test/bdev/bdevio/bdevio.o 00:02:57.626 CC examples/nvmf/nvmf/nvmf.o 00:02:57.626 CC test/blobfs/mkfs/mkfs.o 00:02:57.626 CC app/fio/bdev/fio_plugin.o 00:02:57.626 CC test/app/bdev_svc/bdev_svc.o 00:02:57.626 CC examples/bdev/bdevperf/bdevperf.o 00:02:57.626 CC examples/blob/hello_world/hello_blob.o 00:02:57.626 CC examples/thread/thread/thread_ex.o 00:02:57.626 CC examples/blob/cli/blobcli.o 00:02:57.626 LINK rpc_client_test 00:02:57.626 CC test/event/scheduler/scheduler.o 00:02:57.626 CC test/env/mem_callbacks/mem_callbacks.o 00:02:57.626 LINK spdk_nvme_discover 00:02:57.626 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:57.626 CC test/lvol/esnap/esnap.o 00:02:57.626 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:57.626 LINK spdk_trace_record 00:02:57.626 LINK interrupt_tgt 00:02:57.895 LINK jsoncat 00:02:57.895 LINK vtophys 00:02:57.895 CXX test/cpp_headers/ioat.o 00:02:57.895 LINK pmr_persistence 00:02:57.895 CXX test/cpp_headers/ioat_spec.o 00:02:57.895 CXX test/cpp_headers/iscsi_spec.o 00:02:57.895 CXX test/cpp_headers/json.o 00:02:57.895 CXX test/cpp_headers/jsonrpc.o 00:02:57.895 CXX test/cpp_headers/likely.o 00:02:57.895 LINK lsvmd 00:02:57.895 LINK histogram_perf 00:02:57.895 CXX test/cpp_headers/log.o 00:02:57.895 CXX test/cpp_headers/lvol.o 00:02:57.895 LINK led 00:02:57.895 LINK event_perf 00:02:57.895 CXX test/cpp_headers/memory.o 00:02:57.895 CXX test/cpp_headers/mmio.o 00:02:57.895 CXX test/cpp_headers/nbd.o 00:02:57.895 CXX test/cpp_headers/notify.o 00:02:57.895 CXX test/cpp_headers/nvme.o 00:02:57.895 LINK reactor 00:02:57.895 LINK poller_perf 00:02:57.895 CXX test/cpp_headers/nvme_intel.o 00:02:57.896 LINK env_dpdk_post_init 00:02:57.896 CXX test/cpp_headers/nvme_ocssd.o 00:02:57.896 LINK reactor_perf 00:02:57.896 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:57.896 CXX test/cpp_headers/nvme_spec.o 00:02:57.896 CXX test/cpp_headers/nvme_zns.o 00:02:57.896 CXX test/cpp_headers/nvmf_cmd.o 00:02:57.896 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:57.896 CXX test/cpp_headers/nvmf.o 00:02:57.896 CXX test/cpp_headers/nvmf_spec.o 00:02:57.896 LINK startup 00:02:57.896 LINK zipf 00:02:57.896 LINK nvmf_tgt 00:02:57.896 LINK iscsi_tgt 00:02:57.896 CXX test/cpp_headers/nvmf_transport.o 00:02:57.896 LINK reserve 00:02:57.896 CXX test/cpp_headers/opal.o 00:02:57.896 CXX test/cpp_headers/opal_spec.o 00:02:57.896 LINK spdk_tgt 00:02:57.896 LINK vhost 00:02:57.896 LINK boot_partition 00:02:57.896 LINK app_repeat 00:02:57.896 CXX test/cpp_headers/pci_ids.o 00:02:57.896 CXX test/cpp_headers/pipe.o 00:02:57.896 LINK connect_stress 00:02:57.896 CXX test/cpp_headers/queue.o 00:02:57.896 LINK fused_ordering 00:02:57.896 LINK err_injection 00:02:57.896 CXX test/cpp_headers/reduce.o 00:02:57.896 LINK 
doorbell_aers 00:02:57.896 LINK stub 00:02:57.896 CXX test/cpp_headers/rpc.o 00:02:57.896 LINK cmb_copy 00:02:57.896 CXX test/cpp_headers/scheduler.o 00:02:57.896 CXX test/cpp_headers/scsi.o 00:02:57.896 CXX test/cpp_headers/scsi_spec.o 00:02:57.896 LINK simple_copy 00:02:57.896 LINK ioat_perf 00:02:57.896 CXX test/cpp_headers/sock.o 00:02:57.896 LINK hotplug 00:02:57.896 LINK hello_sock 00:02:57.896 LINK mkfs 00:02:57.896 LINK hello_world 00:02:57.896 LINK reset 00:02:57.896 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:57.896 LINK nvme_dp 00:02:57.896 LINK aer 00:02:57.896 LINK bdev_svc 00:02:57.896 LINK overhead 00:02:57.896 LINK sgl 00:02:57.896 LINK hello_bdev 00:02:57.896 LINK verify 00:02:57.896 LINK fdp 00:02:57.896 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:57.896 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:57.896 LINK scheduler 00:02:57.896 LINK hello_blob 00:02:57.896 LINK thread 00:02:57.896 LINK spdk_trace 00:02:57.896 CXX test/cpp_headers/stdinc.o 00:02:57.896 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:57.896 CXX test/cpp_headers/string.o 00:02:57.896 CXX test/cpp_headers/thread.o 00:02:57.896 CXX test/cpp_headers/trace.o 00:02:57.896 CXX test/cpp_headers/trace_parser.o 00:02:57.896 CXX test/cpp_headers/tree.o 00:02:57.896 CXX test/cpp_headers/ublk.o 00:02:57.896 LINK idxd_perf 00:02:58.158 LINK nvmf 00:02:58.158 CXX test/cpp_headers/util.o 00:02:58.158 CXX test/cpp_headers/uuid.o 00:02:58.158 CXX test/cpp_headers/version.o 00:02:58.158 CXX test/cpp_headers/vfio_user_pci.o 00:02:58.158 CXX test/cpp_headers/vfio_user_spec.o 00:02:58.158 CXX test/cpp_headers/vhost.o 00:02:58.158 LINK reconnect 00:02:58.158 CXX test/cpp_headers/vmd.o 00:02:58.158 CXX test/cpp_headers/xor.o 00:02:58.158 CXX test/cpp_headers/zipf.o 00:02:58.158 LINK test_dma 00:02:58.158 LINK abort 00:02:58.158 LINK arbitration 00:02:58.158 LINK dif 00:02:58.158 LINK bdevio 00:02:58.158 LINK nvme_compliance 00:02:58.158 LINK accel_perf 00:02:58.158 LINK spdk_dd 00:02:58.158 LINK nvme_manage 00:02:58.158 LINK blobcli 00:02:58.158 LINK pci_ut 00:02:58.158 LINK nvme_fuzz 00:02:58.417 LINK llvm_vfio_fuzz 00:02:58.417 LINK mem_callbacks 00:02:58.417 LINK spdk_nvme_identify 00:02:58.417 LINK spdk_nvme 00:02:58.417 LINK spdk_bdev 00:02:58.417 LINK vhost_fuzz 00:02:58.417 LINK spdk_nvme_perf 00:02:58.675 LINK memory_ut 00:02:58.675 LINK spdk_top 00:02:58.675 LINK bdevperf 00:02:58.675 LINK cuse 00:02:58.934 LINK llvm_nvme_fuzz 00:02:58.934 LINK spdk_lock 00:02:59.501 LINK iscsi_fuzz 00:03:01.406 LINK esnap 00:03:01.666 00:03:01.666 real 0m41.251s 00:03:01.666 user 5m44.285s 00:03:01.666 sys 2m53.696s 00:03:01.666 19:01:20 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:01.666 19:01:20 -- common/autotest_common.sh@10 -- $ set +x 00:03:01.666 ************************************ 00:03:01.666 END TEST make 00:03:01.666 ************************************ 00:03:01.666 19:01:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:01.666 19:01:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:01.666 19:01:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:01.666 19:01:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:01.666 19:01:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:01.666 19:01:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:01.666 19:01:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:01.666 19:01:20 -- scripts/common.sh@335 -- # IFS=.-: 00:03:01.666 19:01:20 -- scripts/common.sh@335 -- # read -ra 
ver1 00:03:01.666 19:01:20 -- scripts/common.sh@336 -- # IFS=.-: 00:03:01.666 19:01:20 -- scripts/common.sh@336 -- # read -ra ver2 00:03:01.666 19:01:20 -- scripts/common.sh@337 -- # local 'op=<' 00:03:01.666 19:01:20 -- scripts/common.sh@339 -- # ver1_l=2 00:03:01.666 19:01:20 -- scripts/common.sh@340 -- # ver2_l=1 00:03:01.666 19:01:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:01.666 19:01:20 -- scripts/common.sh@343 -- # case "$op" in 00:03:01.666 19:01:20 -- scripts/common.sh@344 -- # : 1 00:03:01.666 19:01:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:01.666 19:01:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:01.666 19:01:20 -- scripts/common.sh@364 -- # decimal 1 00:03:01.666 19:01:20 -- scripts/common.sh@352 -- # local d=1 00:03:01.666 19:01:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:01.666 19:01:20 -- scripts/common.sh@354 -- # echo 1 00:03:01.666 19:01:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:01.666 19:01:20 -- scripts/common.sh@365 -- # decimal 2 00:03:01.666 19:01:20 -- scripts/common.sh@352 -- # local d=2 00:03:01.666 19:01:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:01.666 19:01:20 -- scripts/common.sh@354 -- # echo 2 00:03:01.666 19:01:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:01.666 19:01:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:01.666 19:01:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:01.666 19:01:20 -- scripts/common.sh@367 -- # return 0 00:03:01.666 19:01:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:01.666 19:01:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:01.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.666 --rc genhtml_branch_coverage=1 00:03:01.666 --rc genhtml_function_coverage=1 00:03:01.666 --rc genhtml_legend=1 00:03:01.666 --rc geninfo_all_blocks=1 00:03:01.666 --rc geninfo_unexecuted_blocks=1 00:03:01.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.666 ' 00:03:01.666 19:01:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:01.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.666 --rc genhtml_branch_coverage=1 00:03:01.666 --rc genhtml_function_coverage=1 00:03:01.666 --rc genhtml_legend=1 00:03:01.666 --rc geninfo_all_blocks=1 00:03:01.666 --rc geninfo_unexecuted_blocks=1 00:03:01.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.666 ' 00:03:01.666 19:01:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:01.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.666 --rc genhtml_branch_coverage=1 00:03:01.666 --rc genhtml_function_coverage=1 00:03:01.666 --rc genhtml_legend=1 00:03:01.666 --rc geninfo_all_blocks=1 00:03:01.666 --rc geninfo_unexecuted_blocks=1 00:03:01.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.666 ' 00:03:01.666 19:01:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:01.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.666 --rc genhtml_branch_coverage=1 00:03:01.666 --rc genhtml_function_coverage=1 00:03:01.666 --rc genhtml_legend=1 00:03:01.666 --rc geninfo_all_blocks=1 00:03:01.666 --rc geninfo_unexecuted_blocks=1 00:03:01.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
00:03:01.666 ' 00:03:01.666 19:01:20 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:01.666 19:01:20 -- nvmf/common.sh@7 -- # uname -s 00:03:01.666 19:01:20 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:01.666 19:01:20 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:01.666 19:01:20 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:01.666 19:01:20 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:01.666 19:01:20 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:01.666 19:01:20 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:01.666 19:01:20 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:01.666 19:01:20 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:01.666 19:01:20 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:01.666 19:01:20 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:01.666 19:01:20 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:01.666 19:01:20 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:01.666 19:01:20 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:01.666 19:01:20 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:01.666 19:01:20 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:01.666 19:01:20 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:01.926 19:01:20 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:01.926 19:01:20 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:01.926 19:01:20 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:01.926 19:01:20 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.926 19:01:20 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.926 19:01:20 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.926 19:01:20 -- paths/export.sh@5 -- # export PATH 00:03:01.926 19:01:20 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.926 19:01:20 -- nvmf/common.sh@46 -- # : 0 00:03:01.926 19:01:20 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:01.926 19:01:20 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:01.926 19:01:20 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:01.926 19:01:20 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:01.926 19:01:20 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:01.926 19:01:20 
-- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:01.926 19:01:20 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:01.926 19:01:20 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:01.926 19:01:20 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:01.926 19:01:20 -- spdk/autotest.sh@32 -- # uname -s 00:03:01.926 19:01:20 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:01.926 19:01:20 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:01.926 19:01:20 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:01.926 19:01:20 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:01.926 19:01:20 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:01.926 19:01:20 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:01.926 19:01:20 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:01.926 19:01:20 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:01.926 19:01:20 -- spdk/autotest.sh@48 -- # udevadm_pid=1224796 00:03:01.926 19:01:20 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.926 19:01:20 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:01.926 19:01:20 -- spdk/autotest.sh@54 -- # echo 1224798 00:03:01.926 19:01:20 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.926 19:01:20 -- spdk/autotest.sh@56 -- # echo 1224799 00:03:01.926 19:01:20 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.926 19:01:20 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:03:01.926 19:01:20 -- spdk/autotest.sh@60 -- # echo 1224800 00:03:01.926 19:01:20 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:01.926 19:01:20 -- spdk/autotest.sh@62 -- # echo 1224801 00:03:01.926 19:01:20 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:01.926 19:01:20 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:01.926 19:01:20 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:01.926 19:01:20 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:01.926 19:01:20 -- common/autotest_common.sh@10 -- # set +x 00:03:01.926 19:01:20 -- spdk/autotest.sh@70 -- # create_test_list 00:03:01.927 19:01:20 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:01.927 19:01:20 -- common/autotest_common.sh@10 -- # set +x 00:03:01.927 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:01.927 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:01.927 19:01:20 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:01.927 19:01:20 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.927 19:01:20 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.927 19:01:20 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:01.927 19:01:20 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.927 19:01:20 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:01.927 19:01:20 -- common/autotest_common.sh@1450 -- # uname 00:03:01.927 19:01:20 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:01.927 19:01:20 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:03:01.927 19:01:20 -- common/autotest_common.sh@1470 -- # uname 00:03:01.927 19:01:20 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:01.927 19:01:20 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:01.927 19:01:20 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:01.927 lcov: LCOV version 1.15 00:03:01.927 19:01:20 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:03.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:03.833 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:03.833 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:16.053 19:01:34 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:16.053 19:01:34 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:16.054 19:01:34 -- common/autotest_common.sh@10 -- # set +x 00:03:16.054 19:01:34 -- spdk/autotest.sh@89 -- # rm -f 00:03:16.054 19:01:34 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:19.356 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:19.356 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:19.615 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:19.615 19:01:38 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:19.615 19:01:38 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:19.615 19:01:38 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:19.615 19:01:38 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:19.615 19:01:38 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:19.615 19:01:38 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:19.615 19:01:38 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:19.615 19:01:38 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:19.615 19:01:38 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:19.615 19:01:38 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:19.615 19:01:38 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:19.615 19:01:38 -- spdk/autotest.sh@108 -- # grep -v p 00:03:19.615 19:01:38 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:19.615 19:01:38 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:19.615 19:01:38 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:19.615 19:01:38 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:19.615 19:01:38 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:19.875 No valid GPT data, bailing 00:03:19.875 19:01:38 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:19.875 19:01:38 -- scripts/common.sh@393 -- # pt= 00:03:19.875 19:01:38 -- scripts/common.sh@394 -- # return 1 00:03:19.875 19:01:38 -- spdk/autotest.sh@112 -- # dd 
if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:19.875 1+0 records in 00:03:19.875 1+0 records out 00:03:19.875 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0046305 s, 226 MB/s 00:03:19.875 19:01:38 -- spdk/autotest.sh@116 -- # sync 00:03:19.875 19:01:38 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:19.875 19:01:38 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:19.875 19:01:38 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:28.112 19:01:45 -- spdk/autotest.sh@122 -- # uname -s 00:03:28.112 19:01:45 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:28.112 19:01:45 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:28.112 19:01:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:28.112 19:01:45 -- common/autotest_common.sh@10 -- # set +x 00:03:28.112 ************************************ 00:03:28.112 START TEST setup.sh 00:03:28.112 ************************************ 00:03:28.112 19:01:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:28.112 * Looking for test storage... 00:03:28.112 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:28.112 19:01:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:28.112 19:01:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:28.112 19:01:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:28.112 19:01:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:28.112 19:01:45 -- scripts/common.sh@335 -- # IFS=.-: 00:03:28.112 19:01:45 -- scripts/common.sh@335 -- # read -ra ver1 00:03:28.112 19:01:45 -- scripts/common.sh@336 -- # IFS=.-: 00:03:28.112 19:01:45 -- scripts/common.sh@336 -- # read -ra ver2 00:03:28.112 19:01:45 -- scripts/common.sh@337 -- # local 'op=<' 00:03:28.112 19:01:45 -- scripts/common.sh@339 -- # ver1_l=2 00:03:28.112 19:01:45 -- scripts/common.sh@340 -- # ver2_l=1 00:03:28.112 19:01:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:28.112 19:01:45 -- scripts/common.sh@343 -- # case "$op" in 00:03:28.112 19:01:45 -- scripts/common.sh@344 -- # : 1 00:03:28.112 19:01:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:28.112 19:01:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:28.112 19:01:45 -- scripts/common.sh@364 -- # decimal 1 00:03:28.112 19:01:45 -- scripts/common.sh@352 -- # local d=1 00:03:28.112 19:01:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:28.112 19:01:45 -- scripts/common.sh@354 -- # echo 1 00:03:28.112 19:01:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:28.112 19:01:45 -- scripts/common.sh@365 -- # decimal 2 00:03:28.112 19:01:45 -- scripts/common.sh@352 -- # local d=2 00:03:28.112 19:01:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:28.112 19:01:45 -- scripts/common.sh@354 -- # echo 2 00:03:28.112 19:01:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:28.112 19:01:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:28.112 19:01:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:28.112 19:01:45 -- scripts/common.sh@367 -- # return 0 00:03:28.112 19:01:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:28.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.112 --rc genhtml_branch_coverage=1 00:03:28.112 --rc genhtml_function_coverage=1 00:03:28.112 --rc genhtml_legend=1 00:03:28.112 --rc geninfo_all_blocks=1 00:03:28.112 --rc geninfo_unexecuted_blocks=1 00:03:28.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.112 ' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:28.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.112 --rc genhtml_branch_coverage=1 00:03:28.112 --rc genhtml_function_coverage=1 00:03:28.112 --rc genhtml_legend=1 00:03:28.112 --rc geninfo_all_blocks=1 00:03:28.112 --rc geninfo_unexecuted_blocks=1 00:03:28.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.112 ' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:28.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.112 --rc genhtml_branch_coverage=1 00:03:28.112 --rc genhtml_function_coverage=1 00:03:28.112 --rc genhtml_legend=1 00:03:28.112 --rc geninfo_all_blocks=1 00:03:28.112 --rc geninfo_unexecuted_blocks=1 00:03:28.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.112 ' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:28.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.112 --rc genhtml_branch_coverage=1 00:03:28.112 --rc genhtml_function_coverage=1 00:03:28.112 --rc genhtml_legend=1 00:03:28.112 --rc geninfo_all_blocks=1 00:03:28.112 --rc geninfo_unexecuted_blocks=1 00:03:28.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.112 ' 00:03:28.112 19:01:45 -- setup/test-setup.sh@10 -- # uname -s 00:03:28.112 19:01:45 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:28.112 19:01:45 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:28.112 19:01:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:28.112 19:01:45 -- common/autotest_common.sh@10 -- # set +x 00:03:28.112 ************************************ 00:03:28.112 START TEST acl 00:03:28.112 ************************************ 00:03:28.112 19:01:45 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:28.112 * Looking for test storage... 00:03:28.112 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:28.112 19:01:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:28.112 19:01:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:28.112 19:01:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:28.112 19:01:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:28.112 19:01:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:28.112 19:01:45 -- scripts/common.sh@335 -- # IFS=.-: 00:03:28.112 19:01:45 -- scripts/common.sh@335 -- # read -ra ver1 00:03:28.112 19:01:45 -- scripts/common.sh@336 -- # IFS=.-: 00:03:28.112 19:01:45 -- scripts/common.sh@336 -- # read -ra ver2 00:03:28.112 19:01:45 -- scripts/common.sh@337 -- # local 'op=<' 00:03:28.113 19:01:45 -- scripts/common.sh@339 -- # ver1_l=2 00:03:28.113 19:01:45 -- scripts/common.sh@340 -- # ver2_l=1 00:03:28.113 19:01:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:28.113 19:01:45 -- scripts/common.sh@343 -- # case "$op" in 00:03:28.113 19:01:45 -- scripts/common.sh@344 -- # : 1 00:03:28.113 19:01:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:28.113 19:01:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:28.113 19:01:45 -- scripts/common.sh@364 -- # decimal 1 00:03:28.113 19:01:45 -- scripts/common.sh@352 -- # local d=1 00:03:28.113 19:01:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:28.113 19:01:45 -- scripts/common.sh@354 -- # echo 1 00:03:28.113 19:01:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:28.113 19:01:45 -- scripts/common.sh@365 -- # decimal 2 00:03:28.113 19:01:45 -- scripts/common.sh@352 -- # local d=2 00:03:28.113 19:01:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:28.113 19:01:45 -- scripts/common.sh@354 -- # echo 2 00:03:28.113 19:01:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:28.113 19:01:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:28.113 19:01:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:28.113 19:01:45 -- scripts/common.sh@367 -- # return 0 00:03:28.113 19:01:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:28.113 19:01:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:28.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.113 --rc genhtml_branch_coverage=1 00:03:28.113 --rc genhtml_function_coverage=1 00:03:28.113 --rc genhtml_legend=1 00:03:28.113 --rc geninfo_all_blocks=1 00:03:28.113 --rc geninfo_unexecuted_blocks=1 00:03:28.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.113 ' 00:03:28.113 19:01:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:28.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.113 --rc genhtml_branch_coverage=1 00:03:28.113 --rc genhtml_function_coverage=1 00:03:28.113 --rc genhtml_legend=1 00:03:28.113 --rc geninfo_all_blocks=1 00:03:28.113 --rc geninfo_unexecuted_blocks=1 00:03:28.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.113 ' 00:03:28.113 19:01:45 -- common/autotest_common.sh@1704 
-- # export 'LCOV=lcov 00:03:28.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.113 --rc genhtml_branch_coverage=1 00:03:28.113 --rc genhtml_function_coverage=1 00:03:28.113 --rc genhtml_legend=1 00:03:28.113 --rc geninfo_all_blocks=1 00:03:28.113 --rc geninfo_unexecuted_blocks=1 00:03:28.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.113 ' 00:03:28.113 19:01:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:28.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:28.113 --rc genhtml_branch_coverage=1 00:03:28.113 --rc genhtml_function_coverage=1 00:03:28.113 --rc genhtml_legend=1 00:03:28.113 --rc geninfo_all_blocks=1 00:03:28.113 --rc geninfo_unexecuted_blocks=1 00:03:28.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:28.113 ' 00:03:28.113 19:01:45 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:28.113 19:01:45 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:28.113 19:01:45 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:28.113 19:01:45 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:28.113 19:01:45 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:28.113 19:01:45 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:28.113 19:01:45 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:28.113 19:01:45 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:28.113 19:01:45 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:28.113 19:01:45 -- setup/acl.sh@12 -- # devs=() 00:03:28.113 19:01:45 -- setup/acl.sh@12 -- # declare -a devs 00:03:28.113 19:01:45 -- setup/acl.sh@13 -- # drivers=() 00:03:28.113 19:01:45 -- setup/acl.sh@13 -- # declare -A drivers 00:03:28.113 19:01:45 -- setup/acl.sh@51 -- # setup reset 00:03:28.113 19:01:45 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:28.113 19:01:45 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:31.405 19:01:49 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:31.405 19:01:49 -- setup/acl.sh@16 -- # local dev driver 00:03:31.405 19:01:49 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:31.405 19:01:49 -- setup/acl.sh@15 -- # setup output status 00:03:31.405 19:01:49 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:31.405 19:01:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:34.699 Hugepages 00:03:34.699 node hugesize free / total 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 00:03:34.699 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 
19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
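The acl.sh trace above is a read loop over `setup.sh status` output: each row is split as `Type BDF Vendor Device NUMA Driver ...`, rows that do not look like a PCI address (the Hugepages table and the column headers) are skipped, ioatdma rows fall through the `[[ ioatdma == nvme ]]` check, and only NVMe controllers land in the devs array. A condensed, roughly equivalent sketch of that filtering (assumptions: the same column layout, and SETUP standing in for the scripts/setup.sh path; this is not the verbatim acl.sh code):

    declare -a devs=()
    declare -A drivers=()
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue          # skip headers and the hugepage table
        [[ $driver == nvme ]] || continue          # ioatdma rows are traced and dropped, as above
        [[ $PCI_BLOCKED == *"$dev"* ]] && continue # mirrors the [[ '' == *...d\8\:\0\0\.\0* ]] block check
        devs+=("$dev")
        drivers[$dev]=$driver
    done < <("$SETUP" status)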
00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # continue 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:34.699 19:01:52 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:34.699 19:01:52 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:34.699 19:01:52 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:34.699 19:01:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.699 19:01:52 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:34.699 19:01:52 -- setup/acl.sh@54 -- # run_test denied denied 00:03:34.699 19:01:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:34.699 19:01:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:34.699 19:01:52 -- common/autotest_common.sh@10 -- # set +x 00:03:34.699 ************************************ 00:03:34.699 START TEST denied 00:03:34.699 ************************************ 00:03:34.699 19:01:52 -- common/autotest_common.sh@1114 -- # denied 00:03:34.699 19:01:52 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:34.699 19:01:52 -- setup/acl.sh@38 -- # setup output config 00:03:34.699 19:01:52 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:34.699 19:01:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.699 19:01:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:37.991 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:37.991 19:01:56 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:37.991 19:01:56 -- setup/acl.sh@28 -- # local dev driver 00:03:37.991 19:01:56 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:37.991 19:01:56 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:37.991 19:01:56 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:37.991 19:01:56 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:37.991 19:01:56 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:37.991 19:01:56 -- setup/acl.sh@41 -- # setup reset 00:03:37.991 19:01:56 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:37.991 19:01:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:43.267 00:03:43.267 real 0m7.931s 00:03:43.267 user 0m2.530s 00:03:43.267 sys 0m4.704s 00:03:43.267 19:02:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:43.267 19:02:00 -- common/autotest_common.sh@10 -- # set +x 00:03:43.267 ************************************ 00:03:43.267 END TEST denied 00:03:43.267 ************************************ 00:03:43.267 19:02:00 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:43.267 19:02:00 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:43.267 19:02:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:43.267 19:02:00 -- common/autotest_common.sh@10 -- # set +x 00:03:43.267 ************************************ 00:03:43.267 START TEST allowed 00:03:43.267 ************************************ 00:03:43.267 19:02:00 -- common/autotest_common.sh@1114 -- # allowed 00:03:43.267 19:02:00 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:43.267 19:02:00 -- setup/acl.sh@45 -- # setup output config 00:03:43.267 19:02:00 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:43.267 19:02:00 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:43.267 19:02:00 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:47.463 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:47.463 19:02:06 -- setup/acl.sh@47 -- # verify 00:03:47.463 19:02:06 -- setup/acl.sh@28 -- # local dev driver 00:03:47.463 19:02:06 -- setup/acl.sh@48 -- # setup reset 00:03:47.463 19:02:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:47.463 19:02:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:51.662 00:03:51.662 real 0m8.817s 00:03:51.662 user 0m2.488s 00:03:51.662 sys 0m4.862s 00:03:51.662 19:02:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:51.662 19:02:09 -- common/autotest_common.sh@10 -- # set +x 00:03:51.662 ************************************ 00:03:51.662 END TEST allowed 00:03:51.662 ************************************ 00:03:51.662 00:03:51.662 real 0m24.301s 00:03:51.662 user 0m7.786s 00:03:51.662 sys 0m14.653s 00:03:51.662 19:02:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:51.662 19:02:09 -- common/autotest_common.sh@10 -- # set +x 00:03:51.662 ************************************ 00:03:51.662 END TEST acl 00:03:51.662 ************************************ 00:03:51.662 19:02:09 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:51.662 19:02:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:51.662 19:02:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:51.662 19:02:09 -- common/autotest_common.sh@10 -- # set +x 00:03:51.662 ************************************ 00:03:51.662 START TEST hugepages 00:03:51.662 ************************************ 00:03:51.662 19:02:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:51.662 * Looking for test storage... 
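The two acl subtests traced above drive the same setup.sh in opposite directions: "denied" exports PCI_BLOCKED=' 0000:d8:00.0' and greps for the "Skipping denied controller" message, while "allowed" exports PCI_ALLOWED=0000:d8:00.0 and greps for the nvme -> vfio-pci rebind. A minimal sketch of the gatekeeping both tests exercise (assumption: this simplifies how setup.sh consults the two lists; pci_can_use here is an illustrative name, not necessarily the script's own):

    pci_can_use() {
        local bdf=$1
        # an allow list, when set, restricts setup to exactly those controllers
        if [[ -n $PCI_ALLOWED && $PCI_ALLOWED != *"$bdf"* ]]; then
            return 1
        fi
        # a blocked controller is skipped even when it would otherwise match,
        # producing the message the denied test greps for
        if [[ $PCI_BLOCKED == *"$bdf"* ]]; then
            echo "Skipping denied controller at $bdf"
            return 1
        fi
        return 0
    }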
00:03:51.662 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:51.662 19:02:09 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:51.662 19:02:09 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:51.662 19:02:09 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:51.662 19:02:10 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:51.662 19:02:10 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:51.662 19:02:10 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:51.662 19:02:10 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:51.662 19:02:10 -- scripts/common.sh@335 -- # IFS=.-: 00:03:51.662 19:02:10 -- scripts/common.sh@335 -- # read -ra ver1 00:03:51.662 19:02:10 -- scripts/common.sh@336 -- # IFS=.-: 00:03:51.662 19:02:10 -- scripts/common.sh@336 -- # read -ra ver2 00:03:51.662 19:02:10 -- scripts/common.sh@337 -- # local 'op=<' 00:03:51.662 19:02:10 -- scripts/common.sh@339 -- # ver1_l=2 00:03:51.662 19:02:10 -- scripts/common.sh@340 -- # ver2_l=1 00:03:51.662 19:02:10 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:51.662 19:02:10 -- scripts/common.sh@343 -- # case "$op" in 00:03:51.662 19:02:10 -- scripts/common.sh@344 -- # : 1 00:03:51.662 19:02:10 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:51.662 19:02:10 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:51.662 19:02:10 -- scripts/common.sh@364 -- # decimal 1 00:03:51.662 19:02:10 -- scripts/common.sh@352 -- # local d=1 00:03:51.662 19:02:10 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:51.662 19:02:10 -- scripts/common.sh@354 -- # echo 1 00:03:51.662 19:02:10 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:51.662 19:02:10 -- scripts/common.sh@365 -- # decimal 2 00:03:51.662 19:02:10 -- scripts/common.sh@352 -- # local d=2 00:03:51.662 19:02:10 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:51.662 19:02:10 -- scripts/common.sh@354 -- # echo 2 00:03:51.662 19:02:10 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:51.662 19:02:10 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:51.662 19:02:10 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:51.662 19:02:10 -- scripts/common.sh@367 -- # return 0 00:03:51.662 19:02:10 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:51.662 19:02:10 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:51.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:51.662 --rc genhtml_branch_coverage=1 00:03:51.662 --rc genhtml_function_coverage=1 00:03:51.662 --rc genhtml_legend=1 00:03:51.662 --rc geninfo_all_blocks=1 00:03:51.662 --rc geninfo_unexecuted_blocks=1 00:03:51.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:51.662 ' 00:03:51.662 19:02:10 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:51.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:51.662 --rc genhtml_branch_coverage=1 00:03:51.662 --rc genhtml_function_coverage=1 00:03:51.662 --rc genhtml_legend=1 00:03:51.662 --rc geninfo_all_blocks=1 00:03:51.662 --rc geninfo_unexecuted_blocks=1 00:03:51.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:51.662 ' 00:03:51.662 19:02:10 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:51.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:51.662 --rc genhtml_branch_coverage=1 
00:03:51.662 --rc genhtml_function_coverage=1 00:03:51.662 --rc genhtml_legend=1 00:03:51.662 --rc geninfo_all_blocks=1 00:03:51.662 --rc geninfo_unexecuted_blocks=1 00:03:51.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:51.662 ' 00:03:51.662 19:02:10 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:51.662 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:51.662 --rc genhtml_branch_coverage=1 00:03:51.662 --rc genhtml_function_coverage=1 00:03:51.662 --rc genhtml_legend=1 00:03:51.662 --rc geninfo_all_blocks=1 00:03:51.662 --rc geninfo_unexecuted_blocks=1 00:03:51.662 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:51.662 ' 00:03:51.662 19:02:10 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:51.662 19:02:10 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:51.662 19:02:10 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:51.662 19:02:10 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:51.662 19:02:10 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:51.662 19:02:10 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:51.662 19:02:10 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:51.662 19:02:10 -- setup/common.sh@18 -- # local node= 00:03:51.662 19:02:10 -- setup/common.sh@19 -- # local var val 00:03:51.662 19:02:10 -- setup/common.sh@20 -- # local mem_f mem 00:03:51.662 19:02:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:51.662 19:02:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:51.662 19:02:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:51.662 19:02:10 -- setup/common.sh@28 -- # mapfile -t mem 00:03:51.662 19:02:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 40860660 kB' 'MemAvailable: 44581324 kB' 'Buffers: 8940 kB' 'Cached: 11205500 kB' 'SwapCached: 0 kB' 'Active: 7971380 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553852 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 449352 kB' 'Mapped: 168496 kB' 'Shmem: 7108576 kB' 'KReclaimable: 223332 kB' 'Slab: 911072 kB' 'SReclaimable: 223332 kB' 'SUnreclaim: 687740 kB' 'KernelStack: 21872 kB' 'PageTables: 7908 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433336 kB' 'Committed_AS: 8779440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.662 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.662 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 
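The long run of `[[ ... == \H\u\g\e\p\a\g\e\s\i\z\e ]]` / `continue` pairs here is get_meminfo scanning the captured snapshot field by field until it reaches Hugepagesize, at which point it echoes 2048. A compact stand-in for the traced loop, assuming the usual `Key: value kB` meminfo layout (awk in place of the traced bash read loop; this is not setup/common.sh itself):

    get_meminfo_value() {
        local key=$1 node=${2:-} mem_f=/proc/meminfo
        [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        # per-node files prefix each line with "Node N "; the traced code strips
        # that first (mem=("${mem[@]#Node +([0-9]) }")), omitted here for brevity
        awk -v k="$key" -F': +' '$1 == k { sub(/ kB$/, "", $2); print $2; exit }' "$mem_f"
    }
    get_meminfo_value Hugepagesize   # -> 2048 on this runner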
00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 
-- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.663 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.663 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.664 19:02:10 -- setup/common.sh@32 -- # continue 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:51.664 19:02:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:51.664 19:02:10 -- 
setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:51.664 19:02:10 -- setup/common.sh@33 -- # echo 2048 00:03:51.664 19:02:10 -- setup/common.sh@33 -- # return 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:51.664 19:02:10 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:51.664 19:02:10 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:51.664 19:02:10 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:51.664 19:02:10 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:51.664 19:02:10 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:51.664 19:02:10 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:51.664 19:02:10 -- setup/hugepages.sh@207 -- # get_nodes 00:03:51.664 19:02:10 -- setup/hugepages.sh@27 -- # local node 00:03:51.664 19:02:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.664 19:02:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:51.664 19:02:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:51.664 19:02:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:51.664 19:02:10 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:51.664 19:02:10 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:51.664 19:02:10 -- setup/hugepages.sh@208 -- # clear_hp 00:03:51.664 19:02:10 -- setup/hugepages.sh@37 -- # local node hp 00:03:51.664 19:02:10 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:51.664 19:02:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:51.664 19:02:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:51.664 19:02:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:51.664 19:02:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:51.664 19:02:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:51.664 19:02:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:51.664 19:02:10 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:51.664 19:02:10 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:51.664 19:02:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:51.664 19:02:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:51.664 19:02:10 -- common/autotest_common.sh@10 -- # set +x 00:03:51.664 ************************************ 00:03:51.664 START TEST default_setup 00:03:51.664 ************************************ 00:03:51.664 19:02:10 -- common/autotest_common.sh@1114 -- # default_setup 00:03:51.664 19:02:10 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:51.664 19:02:10 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:51.664 19:02:10 -- setup/hugepages.sh@51 -- # shift 00:03:51.664 19:02:10 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:51.664 19:02:10 -- setup/hugepages.sh@52 -- # local node_ids 00:03:51.664 19:02:10 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:03:51.664 19:02:10 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:51.664 19:02:10 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:51.664 19:02:10 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:51.664 19:02:10 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:51.664 19:02:10 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:51.664 19:02:10 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:51.664 19:02:10 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:51.664 19:02:10 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:51.664 19:02:10 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:51.664 19:02:10 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:51.664 19:02:10 -- setup/hugepages.sh@73 -- # return 0 00:03:51.664 19:02:10 -- setup/hugepages.sh@137 -- # setup output 00:03:51.664 19:02:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.664 19:02:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:54.955 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:54.955 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:56.868 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:56.868 19:02:15 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:56.868 19:02:15 -- setup/hugepages.sh@89 -- # local node 00:03:56.868 19:02:15 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:56.868 19:02:15 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:56.868 19:02:15 -- setup/hugepages.sh@92 -- # local surp 00:03:56.868 19:02:15 -- setup/hugepages.sh@93 -- # local resv 00:03:56.868 19:02:15 -- setup/hugepages.sh@94 -- # local anon 00:03:56.868 19:02:15 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:56.868 19:02:15 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:56.868 19:02:15 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:56.868 19:02:15 -- setup/common.sh@18 -- # local node= 00:03:56.868 19:02:15 -- setup/common.sh@19 -- # local var val 00:03:56.868 19:02:15 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.868 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.868 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.868 19:02:15 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.868 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.868 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.868 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.868 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 
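The values just traced make the sizing logic concrete: get_test_nr_hugepages is called with 2097152 (kB) for node 0, default_hugepages was read as 2048 kB from Hugepagesize, and nr_hugepages comes out as 1024, all assigned to the single user node. Restated as a worked calculation (values copied from this log):

    size=2097152                                 # kB requested: 2 GiB
    default_hugepages=2048                       # kB per page, from Hugepagesize
    nr_hugepages=$(( size / default_hugepages ))
    echo "$nr_hugepages"                         # -> 1024 pages, placed on node 0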
00:03:56.868 19:02:15 -- setup/hugepages.sh@138 -- # verify_nr_hugepages
00:03:56.868 19:02:15 -- setup/hugepages.sh@89 -- # local node
00:03:56.868 19:02:15 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:56.868 19:02:15 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:56.868 19:02:15 -- setup/hugepages.sh@92 -- # local surp
00:03:56.868 19:02:15 -- setup/hugepages.sh@93 -- # local resv
00:03:56.868 19:02:15 -- setup/hugepages.sh@94 -- # local anon
00:03:56.868 19:02:15 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:56.868 19:02:15 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:56.868 19:02:15 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:56.868 19:02:15 -- setup/common.sh@18 -- # local node=
00:03:56.868 19:02:15 -- setup/common.sh@19 -- # local var val
00:03:56.868 19:02:15 -- setup/common.sh@20 -- # local mem_f mem
00:03:56.868 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:56.868 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:56.868 19:02:15 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:56.868 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem
00:03:56.868 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:56.868 19:02:15 -- setup/common.sh@31 -- # IFS=': '
00:03:56.868 19:02:15 -- setup/common.sh@31 -- # read -r var val _
00:03:56.869 19:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43041752 kB' 'MemAvailable: 46761916 kB' 'Buffers: 8940 kB' 'Cached: 11205644 kB' 'SwapCached: 0 kB' 'Active: 7969908 kB' 'Inactive: 3688336 kB' 'Active(anon): 7552380 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447132 kB' 'Mapped: 168328 kB' 'Shmem: 7108720 kB' 'KReclaimable: 222332 kB' 'Slab: 908012 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685680 kB' 'KernelStack: 22144 kB' 'PageTables: 8004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8782208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: setup/common.sh@32 walks the snapshot above field by field, continuing past every entry until AnonHugePages matches]
00:03:56.869 19:02:15 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:56.869 19:02:15 -- setup/common.sh@33 -- # echo 0
00:03:56.869 19:02:15 -- setup/common.sh@33 -- # return 0
00:03:56.869 19:02:15 -- setup/hugepages.sh@97 -- # anon=0
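
Reduced to a standalone helper, the trace above amounts to the following. This is a reconstruction of the get_meminfo idea in setup/common.sh, not the shipped code (the real helper uses mapfile and also handles per-node files):

    # Reconstruction sketch: pull one field out of /proc/meminfo,
    # scanning line by line the way the @31/@32 trace does.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < /proc/meminfo
        return 1
    }
    anon=$(get_meminfo AnonHugePages)   # -> 0 in this run
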
00:03:56.869 19:02:15 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:56.869 19:02:15 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:56.869 19:02:15 -- setup/common.sh@18 -- # local node=
00:03:56.869 19:02:15 -- setup/common.sh@19 -- # local var val
00:03:56.869 19:02:15 -- setup/common.sh@20 -- # local mem_f mem
00:03:56.869 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:56.869 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:56.869 19:02:15 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:56.869 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem
00:03:56.869 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:56.869 19:02:15 -- setup/common.sh@31 -- # IFS=': '
00:03:56.869 19:02:15 -- setup/common.sh@31 -- # read -r var val _
00:03:56.870 19:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43043080 kB' 'MemAvailable: 46763244 kB' 'Buffers: 8940 kB' 'Cached: 11205648 kB' 'SwapCached: 0 kB' 'Active: 7970408 kB' 'Inactive: 3688336 kB' 'Active(anon): 7552880 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447564 kB' 'Mapped: 168320 kB' 'Shmem: 7108724 kB' 'KReclaimable: 222332 kB' 'Slab: 908096 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685764 kB' 'KernelStack: 22080 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8782808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: setup/common.sh@32 walks the snapshot above field by field, continuing past every entry until HugePages_Surp matches]
00:03:56.871 19:02:15 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:56.871 19:02:15 -- setup/common.sh@33 -- # echo 0
00:03:56.871 19:02:15 -- setup/common.sh@33 -- # return 0
00:03:56.871 19:02:15 -- setup/hugepages.sh@99 -- # surp=0
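
surp is read the same way from HugePages_Surp and stays 0 here because the pool was sized statically; surplus pages appear only when the kernel satisfies requests beyond nr_hugepages. A quick root-shell illustration, with 128 as an arbitrary example cap:

    # HugePages_Surp moves only above the static pool and below this cap:
    echo 128 > /proc/sys/vm/nr_overcommit_hugepages
    grep -E 'HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo
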
00:03:56.871 19:02:15 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:56.871 19:02:15 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:56.871 19:02:15 -- setup/common.sh@18 -- # local node=
00:03:56.871 19:02:15 -- setup/common.sh@19 -- # local var val
00:03:56.871 19:02:15 -- setup/common.sh@20 -- # local mem_f mem
00:03:56.871 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:56.871 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:56.871 19:02:15 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:56.871 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem
00:03:56.871 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:56.871 19:02:15 -- setup/common.sh@31 -- # IFS=': '
00:03:56.871 19:02:15 -- setup/common.sh@31 -- # read -r var val _
00:03:56.871 19:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43044424 kB' 'MemAvailable: 46764588 kB' 'Buffers: 8940 kB' 'Cached: 11205660 kB' 'SwapCached: 0 kB' 'Active: 7971232 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553704 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448328 kB' 'Mapped: 168748 kB' 'Shmem: 7108736 kB' 'KReclaimable: 222332 kB' 'Slab: 907964 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685632 kB' 'KernelStack: 22080 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8785100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: setup/common.sh@32 walks the snapshot above field by field, continuing past every entry until HugePages_Rsvd matches]
00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:56.872 19:02:15 -- setup/common.sh@33 -- # echo 0
00:03:56.872 19:02:15 -- setup/common.sh@33 -- # return 0
00:03:56.872 19:02:15 -- setup/hugepages.sh@100 -- # resv=0
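
With anon, surp, and resv all 0, the consistency checks that follow reduce to 1024 == 1024. The same arithmetic can be replayed in a single awk pass over /proc/meminfo (illustrative only; 1024 is this run's expected pool size):

    awk -v want=1024 '
        /^HugePages_Total:/ { total = $2 }
        /^HugePages_Surp:/  { surp  = $2 }
        /^HugePages_Rsvd:/  { resv  = $2 }
        # mirrors the shape of hugepages.sh@107: (( 1024 == nr_hugepages + surp + resv ))
        END { exit !(want == total + surp + resv) }
    ' /proc/meminfo && echo 'hugepage pool verified'
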
00:03:56.872 19:02:15 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:56.872 resv_hugepages=0 00:03:56.872 19:02:15 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:56.872 surplus_hugepages=0 00:03:56.872 19:02:15 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:56.872 anon_hugepages=0 00:03:56.872 19:02:15 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.872 19:02:15 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:56.872 19:02:15 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:56.872 19:02:15 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:56.872 19:02:15 -- setup/common.sh@18 -- # local node= 00:03:56.872 19:02:15 -- setup/common.sh@19 -- # local var val 00:03:56.872 19:02:15 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.872 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.872 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.872 19:02:15 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.872 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.872 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43043120 kB' 'MemAvailable: 46763284 kB' 'Buffers: 8940 kB' 'Cached: 11205676 kB' 'SwapCached: 0 kB' 'Active: 7975436 kB' 'Inactive: 3688336 kB' 'Active(anon): 7557908 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 452564 kB' 'Mapped: 169028 kB' 'Shmem: 7108752 kB' 'KReclaimable: 222332 kB' 'Slab: 907964 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685632 kB' 'KernelStack: 22064 kB' 'PageTables: 8036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8788956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214468 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.872 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.872 19:02:15 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.873 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.873 19:02:15 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.874 19:02:15 -- setup/common.sh@32 -- # continue 00:03:56.874 19:02:15 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.874 19:02:15 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.874 19:02:15 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.874 19:02:15 -- setup/common.sh@33 -- # echo 1024 00:03:56.874 19:02:15 -- setup/common.sh@33 -- # return 0 00:03:56.874 19:02:15 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.874 19:02:15 -- setup/hugepages.sh@112 -- # get_nodes 00:03:56.874 19:02:15 -- setup/hugepages.sh@27 -- # local node 00:03:56.874 19:02:15 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.874 19:02:15 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.874 19:02:15 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.874 19:02:15 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:56.874 19:02:15 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:56.874 19:02:15 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.874 19:02:15 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:56.874 19:02:15 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.874 19:02:15 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:56.874 19:02:15 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.874 19:02:15 -- setup/common.sh@18 -- # local node=0 00:03:56.874 19:02:15 -- setup/common.sh@19 -- # local var val 00:03:56.874 19:02:15 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.874 19:02:15 -- 
00:03:56.874 19:02:15 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:56.874 19:02:15 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:56.874 19:02:15 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:56.874 19:02:15 -- setup/common.sh@28 -- # mapfile -t mem
00:03:56.874 19:02:15 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:56.874 19:02:15 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18356056 kB' 'MemUsed: 14229312 kB' 'SwapCached: 0 kB' 'Active: 6614856 kB' 'Inactive: 3531488 kB' 'Active(anon): 6337004 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861284 kB' 'Mapped: 150020 kB' 'AnonPages: 288184 kB' 'Shmem: 6051944 kB' 'KernelStack: 13240 kB' 'PageTables: 5688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 484736 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... xtrace loop elided: node0 fields (MemTotal through HugePages_Free) were each compared against HugePages_Surp and skipped with continue ...]
00:03:56.875 19:02:15 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:56.875 19:02:15 -- setup/common.sh@33 -- # echo 0
00:03:56.875 19:02:15 -- setup/common.sh@33 -- # return 0
00:03:56.875 19:02:15 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:56.875 19:02:15 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:56.875 19:02:15 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:56.875 19:02:15 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:56.875 19:02:15 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:56.875 node0=1024 expecting 1024
00:03:56.875 19:02:15 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:56.875
00:03:56.875 real 0m5.056s
00:03:56.875 user 0m1.294s
00:03:56.875 sys 0m2.295s
00:03:56.875 19:02:15 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:56.875 19:02:15 -- common/autotest_common.sh@10 -- # set +x
00:03:56.875 ************************************
00:03:56.875 END TEST default_setup
00:03:56.875 ************************************
00:03:56.875 19:02:15 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:56.875 19:02:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:56.875 19:02:15 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:56.875 19:02:15 -- common/autotest_common.sh@10 -- # set +x
00:03:56.875 ************************************
00:03:56.875 START TEST per_node_1G_alloc
00:03:56.875 ************************************
00:03:56.875 19:02:15 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:56.875 19:02:15 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:56.875 19:02:15 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:56.875 19:02:15 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:56.875 19:02:15 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:56.875 19:02:15 -- setup/hugepages.sh@51 -- # shift
00:03:56.875 19:02:15 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:56.875 19:02:15 -- setup/hugepages.sh@52 -- # local node_ids
00:03:56.875 19:02:15 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:56.875 19:02:15 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:56.875 19:02:15 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:56.875 19:02:15 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:56.875 19:02:15 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:56.875 19:02:15 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:56.875 19:02:15 -- setup/hugepages.sh@65 -- # local _no_nodes=2
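The values get_test_nr_hugepages just settled on follow from simple arithmetic: the requested test size is 1 GiB expressed in kB, and the default hugepage size on this machine is 2048 kB (the Hugepagesize reported later in this log). A hedged re-derivation in standalone bash, with illustrative variable names:

# Re-derivation of the traced values: a 1 GiB request (1048576 kB) divided
# by the 2048 kB default hugepage size yields 512 pages, and the per-node
# pass assigns that count to each of the two listed nodes.
size_kb=1048576           # first argument to get_test_nr_hugepages
default_hugepage_kb=2048  # Hugepagesize reported later in this log
nr_hugepages=$(( size_kb / default_hugepage_kb ))   # 512
declare -A nodes_test
for node in 0 1; do       # node ids passed after the size
    nodes_test[$node]=$nr_hugepages
done
echo "NRHUGE=$nr_hugepages on nodes ${!nodes_test[*]}"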
00:03:56.875 19:02:15 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:56.875 19:02:15 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:56.875 19:02:15 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:56.875 19:02:15 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:56.875 19:02:15 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:56.875 19:02:15 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:56.875 19:02:15 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:56.875 19:02:15 -- setup/hugepages.sh@73 -- # return 0
00:03:56.875 19:02:15 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:56.875 19:02:15 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:56.875 19:02:15 -- setup/hugepages.sh@146 -- # setup output
00:03:56.875 19:02:15 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:56.875 19:02:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:00.174 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:00.174 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:00.174 19:02:18 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:00.174 19:02:18 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:00.174 19:02:18 -- setup/hugepages.sh@89 -- # local node
00:04:00.174 19:02:18 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:00.174 19:02:18 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:00.174 19:02:18 -- setup/hugepages.sh@92 -- # local surp
00:04:00.174 19:02:18 -- setup/hugepages.sh@93 -- # local resv
00:04:00.174 19:02:18 -- setup/hugepages.sh@94 -- # local anon
00:04:00.174 19:02:18 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:00.174 19:02:18 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:00.174 19:02:18 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:00.174 19:02:18 -- setup/common.sh@18 -- # local node=
00:04:00.174 19:02:18 -- setup/common.sh@19 -- # local var val
00:04:00.174 19:02:18 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.174 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.174 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.174 19:02:18 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.174 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.174 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
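For readers reproducing this step by hand: the trace sets NRHUGE and HUGENODE and then runs the workspace's scripts/setup.sh, so an equivalent manual invocation would plausibly look like the sketch below (illustrative; the per-node sysfs path comes from the standard kernel hugepage interface, not from this log):

# Request 512 hugepages on each of nodes 0 and 1, as the trace above does,
# then read back node0's 2 MiB hugepage count from the kernel's sysfs knob.
sudo NRHUGE=512 HUGENODE=0,1 \
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages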
00:04:00.174 19:02:18 -- setup/common.sh@31 -- # IFS=': '
00:04:00.175 19:02:18 -- setup/common.sh@31 -- # read -r var val _
00:04:00.175 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43073780 kB' 'MemAvailable: 46793944 kB' 'Buffers: 8940 kB' 'Cached: 11205748 kB' 'SwapCached: 0 kB' 'Active: 7971276 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553748 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448304 kB' 'Mapped: 168304 kB' 'Shmem: 7108824 kB' 'KReclaimable: 222332 kB' 'Slab: 908108 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685776 kB' 'KernelStack: 22000 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8783672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214576 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... xtrace loop elided: fields MemTotal through HardwareCorrupted were each compared against AnonHugePages and skipped with continue ...]
00:04:00.176 19:02:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:00.176 19:02:18 -- setup/common.sh@33 -- # echo 0
00:04:00.176 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.176 19:02:18 -- setup/hugepages.sh@97 -- # anon=0
00:04:00.176 19:02:18 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:00.176 19:02:18 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.176 19:02:18 -- setup/common.sh@18 -- # local node=
00:04:00.176 19:02:18 -- setup/common.sh@19 -- # local var val
00:04:00.176 19:02:18 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.176 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.176 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.176 19:02:18 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.176 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.176 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.176 19:02:18 -- setup/common.sh@31 -- # IFS=': '
00:04:00.176 19:02:18 -- setup/common.sh@31 -- # read -r var val _
00:04:00.176 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43076640 kB' 'MemAvailable: 46796804 kB' 'Buffers: 8940 kB' 'Cached: 11205752 kB' 'SwapCached: 0 kB' 'Active: 7971584 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554056 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448624 kB' 'Mapped: 168284 kB' 'Shmem: 7108828 kB' 'KReclaimable: 222332 kB' 'Slab: 908096 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685764 kB' 'KernelStack: 22000 kB' 'PageTables: 7912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8782168 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214592 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
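The gate at setup/hugepages.sh@96 above compares the transparent-hugepage policy string against [never]: AnonHugePages is only read into the accounting when THP can actually mint anonymous hugepages. A standalone sketch of the same test (illustrative, not the script itself):

# The policy file brackets the active setting, e.g. "always [madvise] never".
# AnonHugePages only matters to the accounting when THP is not "[never]".
thp_state=$(cat /sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp_state != *"[never]"* ]]; then
    anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    echo "THP active ($thp_state); AnonHugePages=${anon_kb} kB"
fi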
[... xtrace loop elided: fields MemTotal through HugePages_Rsvd were each compared against HugePages_Surp and skipped with continue ...]
00:04:00.177 19:02:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.177 19:02:18 -- setup/common.sh@33 -- # echo 0
00:04:00.177 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.177 19:02:18 -- setup/hugepages.sh@99 -- # surp=0
00:04:00.178 19:02:18 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:00.178 19:02:18 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:00.178 19:02:18 -- setup/common.sh@18 -- # local node=
00:04:00.178 19:02:18 -- setup/common.sh@19 -- # local var val
00:04:00.178 19:02:18 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.178 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.178 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.178 19:02:18 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.178 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.178 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.178 19:02:18 -- setup/common.sh@31 -- # IFS=': '
00:04:00.178 19:02:18 -- setup/common.sh@31 -- # read -r var val _
00:04:00.178 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43074968 kB' 'MemAvailable: 46795132 kB' 'Buffers: 8940 kB' 'Cached: 11205764 kB' 'SwapCached: 0 kB' 'Active: 7971676 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554148 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448728 kB' 'Mapped: 168284 kB' 'Shmem: 7108840 kB' 'KReclaimable: 222332 kB' 'Slab: 908096 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685764 kB' 'KernelStack: 22064 kB' 'PageTables: 8044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8795504 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214576 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
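Unlike the node-scoped HugePages_Surp probe earlier in this test, these verify-step probes pass no node id, so the helper falls back to plain /proc/meminfo. In terms of the hypothetical get_meminfo_sketch shown earlier in this section, the two forms would be:

# Hypothetical usage of the get_meminfo_sketch function defined above:
get_meminfo_sketch HugePages_Surp 0   # node0 file under /sys -> prints "0"
get_meminfo_sketch HugePages_Rsvd     # no node id -> /proc/meminfo -> prints "0"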
[... xtrace loop elided: fields MemTotal through HugePages_Free were each compared against HugePages_Rsvd and skipped with continue ...]
00:04:00.179 19:02:18 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:00.179 19:02:18 -- setup/common.sh@33 -- # echo 0
00:04:00.179 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.179 19:02:18 -- setup/hugepages.sh@100 -- # resv=0
00:04:00.179 19:02:18 --
setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:00.179 nr_hugepages=1024 00:04:00.179 19:02:18 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:00.179 resv_hugepages=0 00:04:00.179 19:02:18 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:00.179 surplus_hugepages=0 00:04:00.179 19:02:18 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:00.179 anon_hugepages=0 00:04:00.179 19:02:18 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:00.179 19:02:18 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:00.179 19:02:18 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:00.179 19:02:18 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:00.179 19:02:18 -- setup/common.sh@18 -- # local node= 00:04:00.179 19:02:18 -- setup/common.sh@19 -- # local var val 00:04:00.179 19:02:18 -- setup/common.sh@20 -- # local mem_f mem 00:04:00.179 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:00.179 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:00.179 19:02:18 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:00.179 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem 00:04:00.179 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:00.179 19:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.179 19:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.180 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43073800 kB' 'MemAvailable: 46793964 kB' 'Buffers: 8940 kB' 'Cached: 11205776 kB' 'SwapCached: 0 kB' 'Active: 7972116 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554588 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 449204 kB' 'Mapped: 168284 kB' 'Shmem: 7108852 kB' 'KReclaimable: 222332 kB' 'Slab: 908284 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 685952 kB' 'KernelStack: 22256 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8783348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214576 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # continue 00:04:00.180 19:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.180 19:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # continue 00:04:00.180 19:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.180 19:02:18 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:00.180 19:02:18 -- setup/common.sh@32 -- # continue 00:04:00.180 19:02:18 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.180 19:02:18 -- 
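The trace above is setup/common.sh's get_meminfo helper at work: snapshot the meminfo file once, split each "Key: value" line on IFS=': ', 'continue' past non-matching keys, and echo the value when the requested key (HugePages_Rsvd above, HugePages_Total next) is found. A minimal standalone sketch of that pattern, with the function name and file paths mirroring the trace; the script framing and the sed-based "Node N " prefix strip are my own simplification, not SPDK's exact code:

    #!/usr/bin/env bash
    # Look up one "Key: value" pair from /proc/meminfo, or from a NUMA
    # node's sysfs meminfo when a node id is supplied.
    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # sysfs lines carry a "Node N " prefix; strip it so keys compare cleanly.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
        return 1
    }

    get_meminfo HugePages_Total     # prints 1024 on this box, per the snapshot
    get_meminfo HugePages_Surp 0    # per-node variant used later in the trace

Doing the split in pure shell keeps the helper dependency-free, which is why the xtrace shows one @32 test per meminfo key rather than a single grep.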
[xtrace elided: the @31/@32 loop 'continue's past every snapshot key (MemTotal through Unaccepted) before matching HugePages_Total]
00:04:00.181 19:02:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:00.181 19:02:18 -- setup/common.sh@33 -- # echo 1024
00:04:00.181 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.181 19:02:18 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:00.181 19:02:18 -- setup/hugepages.sh@112 -- # get_nodes
00:04:00.181 19:02:18 -- setup/hugepages.sh@27 -- # local node
00:04:00.181 19:02:18 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:00.181 19:02:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:00.181 19:02:18 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:00.181 19:02:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:00.181 19:02:18 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:00.181 19:02:18 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:00.181 19:02:18 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:00.181 19:02:18 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:00.181 19:02:18 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:00.181 19:02:18 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.181 19:02:18 -- setup/common.sh@18 -- # local node=0
00:04:00.181 19:02:18 -- setup/common.sh@19 -- # local var val
00:04:00.181 19:02:18 -- setup/common.sh@20 -- # local mem_f mem
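get_nodes (hugepages.sh@27-@33 above) enumerates /sys/devices/system/node/node<N> and records a hugepage count per node; the excerpt shows only the resulting assignments (512 per node), not where the value is read from. A sketch of that enumeration, assuming the standard kernel sysfs layout and the default 2048 kB hugepage size seen in the snapshots; whether SPDK sources the count from this sysfs file or from meminfo is not visible in this excerpt:

    #!/usr/bin/env bash
    declare -a nodes_sys
    for node in /sys/devices/system/node/node[0-9]*; do
        n=${node##*node}
        # hugepages currently configured on this node, per kernel sysfs
        nodes_sys[n]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}
    (( no_nodes > 0 )) || exit 1
    for n in "${!nodes_sys[@]}"; do
        echo "node$n: ${nodes_sys[n]} hugepages"
    done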
00:04:00.181 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.181 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:00.181 19:02:18 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:00.181 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.181 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.181 19:02:18 -- setup/common.sh@31 -- # IFS=': '
00:04:00.181 19:02:18 -- setup/common.sh@31 -- # read -r var val _
00:04:00.181 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19404176 kB' 'MemUsed: 13181192 kB' 'SwapCached: 0 kB' 'Active: 6615620 kB' 'Inactive: 3531488 kB' 'Active(anon): 6337768 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861300 kB' 'Mapped: 149556 kB' 'AnonPages: 289124 kB' 'Shmem: 6051960 kB' 'KernelStack: 13400 kB' 'PageTables: 6360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 485124 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338636 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: per-key scan of the node0 snapshot, 'continue' on every key before HugePages_Surp]
00:04:00.183 19:02:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.183 19:02:18 -- setup/common.sh@33 -- # echo 0
00:04:00.183 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.183 19:02:18 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:00.183 19:02:18 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:00.183 19:02:18 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:00.183 19:02:18 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:00.183 19:02:18 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.183 19:02:18 -- setup/common.sh@18 -- # local node=1
00:04:00.183 19:02:18 -- setup/common.sh@19 -- # local var val
00:04:00.183 19:02:18 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.183 19:02:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.183 19:02:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:00.183 19:02:18 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:00.183 19:02:18 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.183 19:02:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.183 19:02:18 -- setup/common.sh@31 -- # IFS=': '
00:04:00.183 19:02:18 -- setup/common.sh@31 -- # read -r var val _
00:04:00.183 19:02:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 23668640 kB' 'MemUsed: 4029764 kB' 'SwapCached: 0 kB' 'Active: 1356700 kB' 'Inactive: 156848 kB' 'Active(anon): 1217024 kB' 'Inactive(anon): 0 kB' 'Active(file): 139676 kB' 'Inactive(file): 156848 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1353448 kB' 'Mapped: 18728 kB' 'AnonPages: 160220 kB' 'Shmem: 1056924 kB' 'KernelStack: 8920 kB' 'PageTables: 2512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 75844 kB' 'Slab: 423244 kB' 'SReclaimable: 75844 kB' 'SUnreclaim: 347400 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
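The @115-@117 loop traced above builds the expected per-node figure: start from the planned split, add the reserved count, then add whatever HugePages_Surp each node reports (0 for both nodes in this run). Restated compactly, reusing the get_meminfo sketch from earlier and the 512/0 values taken from this run:

    # expected split from get_test_nr_hugepages_per_node; resv from HugePages_Rsvd
    resv=0
    nodes_test=([0]=512 [1]=512)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        surp=$(get_meminfo HugePages_Surp "$node")   # 0 on node0 and node1 here
        (( nodes_test[node] += surp ))
    done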
[xtrace elided: per-key scan of the node1 snapshot, 'continue' on every key before HugePages_Surp]
00:04:00.184 19:02:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.184 19:02:18 -- setup/common.sh@33 -- # echo 0
00:04:00.184 19:02:18 -- setup/common.sh@33 -- # return 0
00:04:00.184 19:02:18 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:00.184 19:02:18 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:00.184 19:02:18 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:00.184 19:02:18 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:00.184 node0=512 expecting 512
00:04:00.184 19:02:18 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:00.184 19:02:18 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:00.184 19:02:18 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
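The sorted_t/sorted_s assignments at @127 are a small set trick: indexing an array by the count itself collapses equal per-node values into a single element, so an even split leaves exactly one distinct key after the loop. A sketch of the same bookkeeping; the final assert is a hypothetical stand-in for the [[ 512 == 512 ]] comparisons the log shows per node:

    declare -a sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1   # one key per distinct expected count
        sorted_s[nodes_sys[node]]=1    # one key per distinct actual count
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done
    # even allocation: every node reported the same, expected count
    (( ${#sorted_t[@]} == 1 && ${#sorted_s[@]} == 1 )) || exit 1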
00:04:00.184 19:02:18 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:00.184 node1=512 expecting 512
00:04:00.184 19:02:18 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:00.184
00:04:00.184 real	0m3.194s
00:04:00.184 user	0m1.148s
00:04:00.184 sys	0m1.962s
00:04:00.184 19:02:18 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:00.184 19:02:18 -- common/autotest_common.sh@10 -- # set +x
00:04:00.184 ************************************
00:04:00.184 END TEST per_node_1G_alloc
00:04:00.184 ************************************
00:04:00.184 19:02:18 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:00.184 19:02:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:00.184 19:02:18 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:00.184 19:02:18 -- common/autotest_common.sh@10 -- # set +x
00:04:00.184 ************************************
00:04:00.184 START TEST even_2G_alloc
00:04:00.184 ************************************
00:04:00.184 19:02:18 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:04:00.184 19:02:18 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:00.184 19:02:18 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:00.184 19:02:18 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:00.184 19:02:18 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:00.184 19:02:18 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:00.184 19:02:18 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:00.184 19:02:18 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:00.184 19:02:18 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:00.184 19:02:18 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:00.184 19:02:18 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:00.184 19:02:18 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:00.184 19:02:18 -- setup/hugepages.sh@83 -- # : 512
00:04:00.184 19:02:18 -- setup/hugepages.sh@84 -- # : 1
00:04:00.184 19:02:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:00.184 19:02:18 -- setup/hugepages.sh@83 -- # : 0
00:04:00.184 19:02:18 -- setup/hugepages.sh@84 -- # : 0
00:04:00.184 19:02:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:00.184 19:02:18 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:00.184 19:02:18 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:00.184 19:02:18 -- setup/hugepages.sh@153 -- # setup output
00:04:00.184 19:02:18 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:00.184 19:02:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:03.483 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:03.483 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:03.483 19:02:21 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:03.483 19:02:21 -- setup/hugepages.sh@89 -- # local node
00:04:03.483 19:02:21 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:03.483 19:02:21 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:03.483 19:02:21 -- setup/hugepages.sh@92 -- # local surp
00:04:03.483 19:02:21 -- setup/hugepages.sh@93 -- # local resv
00:04:03.483 19:02:21 -- setup/hugepages.sh@94 -- # local anon
00:04:03.483 19:02:21 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:03.483 19:02:21 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:03.483 19:02:21 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:03.483 19:02:21 -- setup/common.sh@18 -- # local node=
00:04:03.483 19:02:21 -- setup/common.sh@19 -- # local var val
00:04:03.483 19:02:21 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.483 19:02:21 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.484 19:02:21 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.484 19:02:21 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.484 19:02:21 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.484 19:02:21 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.484 19:02:21 -- setup/common.sh@31 -- # IFS=': '
00:04:03.484 19:02:21 -- setup/common.sh@31 -- # read -r var val _
00:04:03.484 19:02:21 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43047148 kB' 'MemAvailable: 46767312 kB' 'Buffers: 8940 kB' 'Cached: 11205888 kB' 'SwapCached: 0 kB' 'Active: 7973568 kB' 'Inactive: 3688336 kB' 'Active(anon): 7556040 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 450372 kB' 'Mapped: 167656 kB' 'Shmem: 7108964 kB' 'KReclaimable: 222332 kB' 'Slab: 908588 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 686256 kB' 'KernelStack: 21840 kB' 'PageTables: 7492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8777112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
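Between NRHUGE=1024 / HUGE_EVEN_ALLOC=yes and the vfio-pci lines, scripts/setup.sh reconfigures the hugepage pool; its internals are not shown in this excerpt. With HUGE_EVEN_ALLOC set, the conventional mechanism for an even split is a per-node sysfs write, which the node0=512 / node1=512 readings above are consistent with. A plausible sketch under those assumptions (needs root; the paths are the standard kernel sysfs layout, not quoted from the log):

    NRHUGE=1024
    nodes=(/sys/devices/system/node/node[0-9]*)
    per_node=$(( NRHUGE / ${#nodes[@]} ))    # 512 apiece on this 2-node box
    for node in "${nodes[@]}"; do
        # ask the kernel to hold per_node 2 MiB hugepages on this node
        echo "$per_node" > "$node/hugepages/hugepages-2048kB/nr_hugepages"
    done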
[xtrace elided: the @31/@32 per-key scan for AnonHugePages walks the snapshot keys (MemTotal through Percpu); the log excerpt is truncated here, mid-scan]
00:04:03.485 19:02:22 -- setup/common.sh@32 -- #
continue 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:03.485 19:02:22 -- setup/common.sh@33 -- # echo 0 00:04:03.485 19:02:22 -- setup/common.sh@33 -- # return 0 00:04:03.485 19:02:22 -- setup/hugepages.sh@97 -- # anon=0 00:04:03.485 19:02:22 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:03.485 19:02:22 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:03.485 19:02:22 -- setup/common.sh@18 -- # local node= 00:04:03.485 19:02:22 -- setup/common.sh@19 -- # local var val 00:04:03.485 19:02:22 -- setup/common.sh@20 -- # local mem_f mem 00:04:03.485 19:02:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:03.485 19:02:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:03.485 19:02:22 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:03.485 19:02:22 -- setup/common.sh@28 -- # mapfile -t mem 00:04:03.485 19:02:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:03.485 19:02:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43042676 kB' 'MemAvailable: 46762840 kB' 'Buffers: 8940 kB' 'Cached: 11205892 kB' 'SwapCached: 0 kB' 'Active: 7969676 kB' 'Inactive: 3688336 kB' 'Active(anon): 7552148 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 446432 kB' 'Mapped: 167144 kB' 'Shmem: 7108968 kB' 'KReclaimable: 222332 kB' 'Slab: 908720 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 686388 kB' 'KernelStack: 21808 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8772996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.485 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.485 19:02:22 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.485 19:02:22 -- 
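[editor's note] The snapshot above is what get_meminfo just captured with mapfile; the condensed scan that follows walks it key by key, and every IFS=': ' / read -r var val _ / continue triple in this trace is one iteration of that walk. A minimal sketch of the lookup pattern, reconstructed from the trace rather than copied from SPDK's setup/common.sh (meminfo_lookup is a hypothetical name):

```bash
#!/usr/bin/env bash
# Sketch of the scan the trace shows: walk /proc/meminfo one
# "Key: value unit" line at a time and print the value of one key.
meminfo_lookup() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # each non-matching key produces one "continue" line in the xtrace
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}

meminfo_lookup AnonHugePages   # prints 0 (value in kB) on the machine traced above
```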
[trace condensed: setup/common.sh@31-32 scanned the snapshot above key by key (MemTotal through HugePages_Rsvd), hitting "continue" on every key that was not HugePages_Surp]
00:04:03.486 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.486 19:02:22 -- setup/common.sh@33 -- # echo 0
00:04:03.486 19:02:22 -- setup/common.sh@33 -- # return 0
00:04:03.486 19:02:22 -- setup/hugepages.sh@99 -- # surp=0
00:04:03.486 19:02:22 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:03.486 19:02:22 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:03.486 19:02:22 -- setup/common.sh@18 -- # local node=
00:04:03.486 19:02:22 -- setup/common.sh@19 -- # local var val
00:04:03.486 19:02:22 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.487 19:02:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.487 19:02:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.487 19:02:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.487 19:02:22 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.487 19:02:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.487 19:02:22 -- setup/common.sh@31 -- # IFS=': '
00:04:03.487 19:02:22 -- setup/common.sh@31 -- # read -r var val _
00:04:03.487 19:02:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43043060 kB' 'MemAvailable: 46763224 kB' 'Buffers: 8940 kB' 'Cached: 11205904 kB' 'SwapCached: 0 kB' 'Active: 7969288 kB' 'Inactive: 3688336 kB' 'Active(anon): 7551760 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 446112 kB' 'Mapped: 167144 kB' 'Shmem: 7108980 kB' 'KReclaimable: 222332 kB' 'Slab: 908720 kB' 'SReclaimable: 222332 kB' 'SUnreclaim: 686388 kB' 'KernelStack: 21808 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8773152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[trace condensed: setup/common.sh@31-32 scanned the snapshot key by key (MemTotal through HugePages_Free), hitting "continue" on every key that was not HugePages_Rsvd]
00:04:03.488 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:03.488 19:02:22 -- setup/common.sh@33 -- # echo 0
00:04:03.488 19:02:22 -- setup/common.sh@33 -- # return 0
00:04:03.488 19:02:22 -- setup/hugepages.sh@100 -- # resv=0
00:04:03.488 19:02:22 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:03.488 nr_hugepages=1024
00:04:03.750 19:02:22 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:03.750 resv_hugepages=0
00:04:03.750 19:02:22 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:03.750 surplus_hugepages=0
00:04:03.750 19:02:22 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:03.750 anon_hugepages=0
00:04:03.750 19:02:22 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:03.750 19:02:22 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
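[editor's note] The checks at hugepages.sh@107-110 assert that the configured pool and the kernel's reported pool agree: HugePages_Total (fetched next, below) must equal nr_hugepages plus surplus plus reserved pages, all zero here apart from the 1024 requested pages. A standalone re-derivation of that arithmetic, using awk over /proc/meminfo instead of the script's own helpers (an assumption; the trace only confirms the comparison itself):

```bash
#!/usr/bin/env bash
# Re-derive the consistency check at setup/hugepages.sh@107-110.
# nr_hugepages=1024 is the count this particular run configured.
nr_hugepages=1024
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/  {print $2}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd:/  {print $2}' /proc/meminfo)
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total"
else
    echo "mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
fi
```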
00:04:03.750 19:02:22 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:03.750 19:02:22 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:03.750 19:02:22 -- setup/common.sh@18 -- # local node=
00:04:03.750 19:02:22 -- setup/common.sh@19 -- # local var val
00:04:03.750 19:02:22 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.750 19:02:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.750 19:02:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:03.750 19:02:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:03.750 19:02:22 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.750 19:02:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.750 19:02:22 -- setup/common.sh@31 -- # IFS=': '
00:04:03.750 19:02:22 -- setup/common.sh@31 -- # read -r var val _
00:04:03.750 19:02:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43046444 kB' 'MemAvailable: 46766600 kB' 'Buffers: 8940 kB' 'Cached: 11205920 kB' 'SwapCached: 0 kB' 'Active: 7970176 kB' 'Inactive: 3688336 kB' 'Active(anon): 7552648 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447036 kB' 'Mapped: 167144 kB' 'Shmem: 7108996 kB' 'KReclaimable: 222316 kB' 'Slab: 908696 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686380 kB' 'KernelStack: 21824 kB' 'PageTables: 7452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8773028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[trace condensed: setup/common.sh@31-32 scanned the snapshot key by key (MemTotal through Unaccepted), hitting "continue" on every key that was not HugePages_Total]
00:04:03.751 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:03.751 19:02:22 -- setup/common.sh@33 -- # echo 1024
00:04:03.751 19:02:22 -- setup/common.sh@33 -- # return 0
00:04:03.751 19:02:22 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:03.751 19:02:22 -- setup/hugepages.sh@112 -- # get_nodes
00:04:03.751 19:02:22 -- setup/hugepages.sh@27 -- # local node
00:04:03.752 19:02:22 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.752 19:02:22 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:03.752 19:02:22 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:03.752 19:02:22 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:03.752 19:02:22 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:03.752 19:02:22 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:03.752 19:02:22 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.752 19:02:22 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.752 19:02:22 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:03.752 19:02:22 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.752 19:02:22 -- setup/common.sh@18 -- # local node=0
00:04:03.752 19:02:22 -- setup/common.sh@19 -- # local var val
00:04:03.752 19:02:22 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.752 19:02:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.752 19:02:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:03.752 19:02:22 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:03.752 19:02:22 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.752 19:02:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': '
00:04:03.752 19:02:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19394888 kB' 'MemUsed: 13190480 kB' 'SwapCached: 0 kB' 'Active: 6615844 kB' 'Inactive: 3531488 kB' 'Active(anon): 6337992 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861356 kB' 'Mapped: 149292 kB' 'AnonPages: 289176 kB' 'Shmem: 6052016 kB' 'KernelStack: 13000 kB' 'PageTables: 5280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 485152 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338664 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _
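[editor's note] This lookup differs from the earlier ones in one way: because get_meminfo received a node argument (0), common.sh@23-24 switched mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that common.sh@29 strips with an extglob pattern before the usual key scan. A self-contained sketch of that per-node variant, mirroring the commands visible in the trace (node_meminfo_lookup is a hypothetical name, not SPDK's function):

```bash
#!/usr/bin/env bash
# Per-node variant of the lookup, mirroring common.sh@17-33 above.
shopt -s extglob   # needed for the +([0-9]) pattern below
node_meminfo_lookup() {
    local get=$1 node=$2 var val _
    local mem_f=/proc/meminfo mem
    # with a node argument, read the per-node sysfs file instead
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # per-node lines look like "Node 0 MemTotal: ..."; drop the prefix
    mem=("${mem[@]#Node +([0-9]) }")
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

node_meminfo_lookup HugePages_Surp 0   # 0 on node0 in this run
```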
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # 
continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.752 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.752 19:02:22 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # continue 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': ' 00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _ 00:04:03.753 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:03.753 19:02:22 -- setup/common.sh@33 -- # echo 0 00:04:03.753 19:02:22 -- setup/common.sh@33 -- # return 0 00:04:03.753 19:02:22 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:03.753 19:02:22 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:03.753 19:02:22 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:03.753 19:02:22 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:03.753 19:02:22 -- 
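The get_meminfo helper traced above resolves a single /proc/meminfo-style key, preferring the per-NUMA-node sysfs view when one exists for the requested node. The following is a minimal re-creation of that lookup, assuming only standard bash; get_meminfo_sketch is an illustrative name, not the SPDK function itself.

shopt -s extglob    # the +([0-9]) pattern below needs extended globs

get_meminfo_sketch() {
    local get=$1 node=$2
    local var val _
    local mem_f=/proc/meminfo
    local -a mem
    # Prefer the per-node view when sysfs exposes one for this node.
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    # node*/meminfo prefixes every line with "Node N "; strip it so the
    # keys look the same as in the global /proc/meminfo.
    mem=("${mem[@]#Node +([0-9]) }")
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo_sketch HugePages_Surp 0    # prints the node0 surplus; 0 in the trace above
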
00:04:03.753 19:02:22 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:03.753 19:02:22 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:03.753 19:02:22 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:03.753 19:02:22 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:03.753 19:02:22 -- setup/common.sh@18 -- # local node=1
00:04:03.753 19:02:22 -- setup/common.sh@19 -- # local var val
00:04:03.753 19:02:22 -- setup/common.sh@20 -- # local mem_f mem
00:04:03.753 19:02:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:03.753 19:02:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:03.753 19:02:22 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:03.753 19:02:22 -- setup/common.sh@28 -- # mapfile -t mem
00:04:03.753 19:02:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:03.753 19:02:22 -- setup/common.sh@31 -- # IFS=': '
00:04:03.753 19:02:22 -- setup/common.sh@31 -- # read -r var val _
00:04:03.753 19:02:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 23651584 kB' 'MemUsed: 4046820 kB' 'SwapCached: 0 kB' 'Active: 1353892 kB' 'Inactive: 156848 kB' 'Active(anon): 1214216 kB' 'Inactive(anon): 0 kB' 'Active(file): 139676 kB' 'Inactive(file): 156848 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1353532 kB' 'Mapped: 17852 kB' 'AnonPages: 157260 kB' 'Shmem: 1057008 kB' 'KernelStack: 8808 kB' 'PageTables: 2112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 75828 kB' 'Slab: 423544 kB' 'SReclaimable: 75828 kB' 'SUnreclaim: 347716 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: each node1 field from MemTotal through HugePages_Free compared against HugePages_Surp, no match, loop continues ...]
00:04:03.754 19:02:22 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:03.754 19:02:22 -- setup/common.sh@33 -- # echo 0
00:04:03.754 19:02:22 -- setup/common.sh@33 -- # return 0
00:04:03.754 19:02:22 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.754 19:02:22 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.754 19:02:22 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.754 19:02:22 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:03.754 19:02:22 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:03.754 19:02:22 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:03.754 19:02:22 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:03.754 19:02:22 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:03.754 19:02:22 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:03.754 
00:04:03.754 real	0m3.705s
00:04:03.754 user	0m1.427s
00:04:03.754 sys	0m2.345s
00:04:03.754 19:02:22 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:03.754 19:02:22 -- common/autotest_common.sh@10 -- # set +x
00:04:03.754 ************************************
00:04:03.754 END TEST even_2G_alloc
00:04:03.754 ************************************
00:04:03.754 19:02:22 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:03.754 19:02:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:03.754 19:02:22 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:03.754 19:02:22 -- common/autotest_common.sh@10 -- # set +x
00:04:03.754 ************************************
00:04:03.754 START TEST odd_alloc
00:04:03.754 ************************************
00:04:03.754 19:02:22 -- common/autotest_common.sh@1114 -- # odd_alloc
00:04:03.754 19:02:22 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:03.754 19:02:22 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:03.754 19:02:22 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
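get_test_nr_hugepages has just turned the 2098176 kB request into 1025 pages (an odd count, which is the point of this test), and get_test_nr_hugepages_per_node, traced next, spreads that count across the two nodes so that node0 ends up with 513 and node1 with 512. Below is a sketch of equivalent ceiling-division logic, reconstructed from the trace; split_hugepages is an illustrative name, and the bare ":" lines mirror the @83/@84 bookkeeping visible in the xtrace.

split_hugepages() {
    local _nr_hugepages=$1 _no_nodes=$2
    local -a nodes_test
    while (( _no_nodes > 0 )); do
        # Give the current node an even share of what is still unassigned;
        # the remainder naturally lands on the lower-numbered nodes.
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # traces as ": 513", then ": 0"
        : $(( --_no_nodes ))                                  # traces as ": 1", then ": 0"
    done
    declare -p nodes_test
}

split_hugepages 1025 2    # declare -a nodes_test=([0]="513" [1]="512")
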
00:04:03.754 19:02:22 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:03.754 19:02:22 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:03.754 19:02:22 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:03.754 19:02:22 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:03.754 19:02:22 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:03.754 19:02:22 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:03.754 19:02:22 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:03.754 19:02:22 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:03.754 19:02:22 -- setup/hugepages.sh@83 -- # : 513
00:04:03.754 19:02:22 -- setup/hugepages.sh@84 -- # : 1
00:04:03.754 19:02:22 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:03.754 19:02:22 -- setup/hugepages.sh@83 -- # : 0
00:04:03.754 19:02:22 -- setup/hugepages.sh@84 -- # : 0
00:04:03.754 19:02:22 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:03.754 19:02:22 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:03.754 19:02:22 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:03.754 19:02:22 -- setup/hugepages.sh@160 -- # setup output
00:04:03.754 19:02:22 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:03.754 19:02:22 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:07.047 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.047 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.365 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
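With the pool resized, setup.sh leaves the devices bound to vfio-pci and verify_nr_hugepages takes over below. Its global check, asserted earlier in this trace at hugepages.sh@110, is that the kernel's pool matches the requested page count plus surplus and reserved pages. A standalone sketch of that invariant, assuming the standard /proc/meminfo layout; verify_pool is an illustrative name, not the SPDK helper.

verify_pool() {
    local expected=$1 total surp resv
    # Pull the three hugepage counters straight out of /proc/meminfo.
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    # The invariant asserted at hugepages.sh@110:
    #   HugePages_Total == nr_hugepages + HugePages_Surp + HugePages_Rsvd
    (( total == expected + surp + resv ))
}

verify_pool 1025 && echo "hugepage pool matches the odd_alloc request"
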
00:04:07.365 19:02:25 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:07.365 19:02:25 -- setup/hugepages.sh@89 -- # local node
00:04:07.365 19:02:25 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:07.365 19:02:25 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:07.365 19:02:25 -- setup/hugepages.sh@92 -- # local surp
00:04:07.365 19:02:25 -- setup/hugepages.sh@93 -- # local resv
00:04:07.365 19:02:25 -- setup/hugepages.sh@94 -- # local anon
00:04:07.365 19:02:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:07.365 19:02:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:07.365 19:02:25 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:07.365 19:02:25 -- setup/common.sh@18 -- # local node=
00:04:07.365 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.365 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.365 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.365 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.365 19:02:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.365 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.365 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.365 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.365 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.365 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43057608 kB' 'MemAvailable: 46777764 kB' 'Buffers: 8940 kB' 'Cached: 11206020 kB' 'SwapCached: 0 kB' 'Active: 7971028 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553500 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447716 kB' 'Mapped: 167276 kB' 'Shmem: 7109096 kB' 'KReclaimable: 222316 kB' 'Slab: 908644 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686328 kB' 'KernelStack: 21872 kB' 'PageTables: 7624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 8778196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214512 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: each meminfo field from MemTotal through HardwareCorrupted compared against AnonHugePages, no match, loop continues ...]
00:04:07.366 19:02:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:07.366 19:02:25 -- setup/common.sh@33 -- # echo 0
00:04:07.366 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.366 19:02:25 -- setup/hugepages.sh@97 -- # anon=0
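The guard just traced at hugepages.sh@96 tests the transparent_hugepage sysfs knob, where the active mode is the bracketed token, and only bothers reading AnonHugePages when THP is not pinned to "never"; here it runs and anon comes back 0. A sketch of that guard under the usual sysfs path; thp_enabled is an illustrative name.

thp_enabled() {
    local modes
    modes=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    # "[never]" as the active mode means THP is disabled entirely.
    [[ $modes != *"[never]"* ]]
}

anon=0
thp_enabled && anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
echo "anon=${anon} kB"
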
00:04:07.366 19:02:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:07.366 19:02:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.366 19:02:25 -- setup/common.sh@18 -- # local node=
00:04:07.366 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.366 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.366 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.366 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.366 19:02:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.366 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.366 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.366 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.367 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.367 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43058504 kB' 'MemAvailable: 46778660 kB' 'Buffers: 8940 kB' 'Cached: 11206024 kB' 'SwapCached: 0 kB' 'Active: 7970988 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553460 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447652 kB' 'Mapped: 167224 kB' 'Shmem: 7109100 kB' 'KReclaimable: 222316 kB' 'Slab: 908672 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686356 kB' 'KernelStack: 21952 kB' 'PageTables: 7628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 8776692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214496 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: each meminfo field from MemTotal through HugePages_Rsvd compared against HugePages_Surp, no match, loop continues ...]
00:04:07.368 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.368 19:02:25 -- setup/common.sh@33 -- # echo 0
00:04:07.368 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.368 19:02:25 -- setup/hugepages.sh@99 -- # surp=0
00:04:07.368 19:02:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:07.368 19:02:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:07.368 19:02:25 -- setup/common.sh@18 -- # local node=
00:04:07.368 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.368 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.368 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.368 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.368 19:02:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.368 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.368 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.368 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.368 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.368 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43059720 kB' 'MemAvailable: 46779876 kB' 'Buffers: 8940 kB' 'Cached: 11206036 kB' 'SwapCached: 0 kB' 'Active: 7971720 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554192 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448324 kB' 'Mapped: 167148 kB' 'Shmem: 7109112 kB' 'KReclaimable: 222316 kB' 'Slab: 908656 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686340 kB' 'KernelStack: 21920 kB' 'PageTables: 7828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 8778220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214544 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... xtrace elided: meminfo fields from MemTotal through KReclaimable compared against HugePages_Rsvd, no match, loop continues ...]
00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var
val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 
19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # continue 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.369 19:02:25 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.369 19:02:25 -- setup/common.sh@32 -- # 
00:04:07.369 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:07.369 19:02:25 -- setup/common.sh@33 -- # echo 0
00:04:07.369 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.369 19:02:25 -- setup/hugepages.sh@100 -- # resv=0
00:04:07.369 19:02:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:07.369 nr_hugepages=1025
00:04:07.369 19:02:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:07.369 resv_hugepages=0
00:04:07.369 19:02:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:07.369 surplus_hugepages=0
00:04:07.369 19:02:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:07.369 anon_hugepages=0
00:04:07.369 19:02:25 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:07.369 19:02:25 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
00:04:07.369 19:02:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:07.369 19:02:25 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:07.369 19:02:25 -- setup/common.sh@18 -- # local node=
00:04:07.369 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.369 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.369 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.369 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.369 19:02:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.369 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.369 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.369 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.370 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43059492 kB' 'MemAvailable: 46779648 kB' 'Buffers: 8940 kB' 'Cached: 11206052 kB' 'SwapCached: 0 kB' 'Active: 7971236 kB' 'Inactive: 3688336 kB' 'Active(anon): 7553708 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 447828 kB' 'Mapped: 167148 kB' 'Shmem: 7109128 kB' 'KReclaimable: 222316 kB' 'Slab: 908624 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686308 kB' 'KernelStack: 21968 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 8778236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214560 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:07.370 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.370 19:02:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:07.370 19:02:25 -- setup/common.sh@32 -- # continue
[... the read/compare/continue cycle repeats for every field from MemFree through Unaccepted; no match ...]
00:04:07.371 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:07.371 19:02:25 -- setup/common.sh@33 -- # echo 1025
00:04:07.371 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.371 19:02:25 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:07.371 19:02:25 -- setup/hugepages.sh@112 -- # get_nodes
00:04:07.371 19:02:25 -- setup/hugepages.sh@27 -- # local node
00:04:07.371 19:02:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.371 19:02:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:07.371 19:02:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.371 19:02:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:07.371 19:02:25 -- setup/hugepages.sh@32 -- # no_nodes=2
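The long compare/continue runs above are setup/common.sh's get_meminfo walking the meminfo stream one 'key: value' pair at a time and echoing the value once the requested key matches (0 for HugePages_Rsvd and HugePages_Surp, 1025 for HugePages_Total). A condensed standalone sketch of that lookup logic; the function name and the missing-key fallback are illustrative assumptions, not the exact SPDK helper:

    #!/usr/bin/env bash
    # Look up one key in /proc/meminfo, or in a NUMA node's meminfo file when
    # a node number is given (per-node files prefix every key with "Node N ").
    get_meminfo_sketch() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS= read -r line; do
            line=${line#"Node $node "}              # drop the per-node prefix
            IFS=': ' read -r var val _ <<< "$line"  # split "key: value [kB]"
            if [[ $var == "$get" ]]; then
                echo "$val"                         # e.g. 1025 for HugePages_Total
                return 0
            fi
        done < "$mem_f"
        echo 0                                      # assumed fallback: key absent
    }

    get_meminfo_sketch HugePages_Total              # global hugepage count
    get_meminfo_sketch HugePages_Surp 0             # surplus pages on node0

The traced version slurps the file with mapfile and strips the 'Node +([0-9]) ' prefix with an extglob substitution before the same IFS=': ' read; the effect is the same.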
00:04:07.371 19:02:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:07.371 19:02:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.371 19:02:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.371 19:02:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:07.371 19:02:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.371 19:02:25 -- setup/common.sh@18 -- # local node=0
00:04:07.371 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.371 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.371 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.371 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:07.371 19:02:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:07.371 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.371 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.633 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.633 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.633 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19413668 kB' 'MemUsed: 13171700 kB' 'SwapCached: 0 kB' 'Active: 6617184 kB' 'Inactive: 3531488 kB' 'Active(anon): 6339332 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861400 kB' 'Mapped: 149296 kB' 'AnonPages: 290452 kB' 'Shmem: 6052060 kB' 'KernelStack: 13032 kB' 'PageTables: 5428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 484964 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338476 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:07.633 19:02:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.633 19:02:25 -- setup/common.sh@32 -- # continue
[... the read/compare/continue cycle repeats for every node0 field from MemFree through HugePages_Free; no match ...]
00:04:07.634 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.634 19:02:25 -- setup/common.sh@33 -- # echo 0
00:04:07.634 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.634 19:02:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.634 19:02:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.634 19:02:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.634 19:02:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:07.634 19:02:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.634 19:02:25 -- setup/common.sh@18 -- # local node=1
00:04:07.634 19:02:25 -- setup/common.sh@19 -- # local var val
00:04:07.634 19:02:25 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.634 19:02:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.634 19:02:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:07.634 19:02:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:07.634 19:02:25 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.634 19:02:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.634 19:02:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 23659540 kB' 'MemUsed: 4038864 kB' 'SwapCached: 0 kB' 'Active: 1354064 kB' 'Inactive: 156848 kB' 'Active(anon): 1214388 kB' 'Inactive(anon): 0 kB' 'Active(file): 139676 kB' 'Inactive(file): 156848 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1353620 kB' 'Mapped: 17852 kB' 'AnonPages: 157308 kB' 'Shmem: 1057096 kB' 'KernelStack: 8728 kB' 'PageTables: 2048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 75828 kB' 'Slab: 423736 kB' 'SReclaimable: 75828 kB' 'SUnreclaim: 347908 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:04:07.634 19:02:25 -- setup/common.sh@31 -- # IFS=': '
00:04:07.634 19:02:25 -- setup/common.sh@31 -- # read -r var val _
00:04:07.634 19:02:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.634 19:02:25 -- setup/common.sh@32 -- # continue
[... the read/compare/continue cycle repeats for every node1 field from MemFree through HugePages_Free; no match ...]
00:04:07.635 19:02:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.635 19:02:25 -- setup/common.sh@33 -- # echo 0
00:04:07.635 19:02:25 -- setup/common.sh@33 -- # return 0
00:04:07.635 19:02:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
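With both surplus and reserved pages at 0, the per-node walk above reduces to confirming that node0 and node1 still hold the odd 512/513 split the test configured. A minimal sketch of that reconciliation, reading the same /proc and /sys files the trace shows (helper and variable names are assumptions):

    #!/usr/bin/env bash
    # Reconcile the global hugepage counters, then report each node's share.
    requested=1025                                   # pages odd_alloc configured

    hp() {  # hp <HugePages_* key> [file]; these fields carry no kB unit
        awk -v k="$1:" '$1 == k || $3 == k {print $NF}' "${2:-/proc/meminfo}"
    }

    total=$(hp HugePages_Total)
    surp=$(hp HugePages_Surp)
    resv=$(hp HugePages_Rsvd)
    (( total == requested + surp + resv )) || { echo "counters disagree"; exit 1; }

    # Per-node counts live in sysfs; keys there are prefixed with "Node N".
    for node in /sys/devices/system/node/node[0-9]*; do
        echo "node${node##*node}=$(hp HugePages_Total "$node/meminfo")"
    done
    # Expected in this run: node0=512 and node1=513 (512 + 513 == 1025).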
00:04:07.635 19:02:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.635 19:02:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.635 19:02:26 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.635 19:02:26 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:07.635 node0=512 expecting 513
00:04:07.635 19:02:26 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.635 19:02:26 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.635 19:02:26 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.635 19:02:26 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:07.635 node1=513 expecting 512
00:04:07.635 19:02:26 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:07.635
00:04:07.635 real 0m3.764s
00:04:07.635 user 0m1.467s
00:04:07.635 sys 0m2.365s
00:04:07.635 19:02:26 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:07.635 19:02:26 -- common/autotest_common.sh@10 -- # set +x
00:04:07.635 ************************************
00:04:07.635 END TEST odd_alloc
00:04:07.635 ************************************
00:04:07.635 19:02:26 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:07.635 19:02:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:07.635 19:02:26 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:07.635 19:02:26 -- common/autotest_common.sh@10 -- # set +x
00:04:07.635 ************************************
00:04:07.635 START TEST custom_alloc
00:04:07.635 ************************************
00:04:07.635 19:02:26 -- common/autotest_common.sh@1114 -- # custom_alloc
00:04:07.635 19:02:26 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:07.635 19:02:26 -- setup/hugepages.sh@169 -- # local node
00:04:07.635 19:02:26 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:07.635 19:02:26 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:07.635 19:02:26 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:07.635 19:02:26 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:07.635 19:02:26 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:07.635 19:02:26 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:07.635 19:02:26 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:07.635 19:02:26 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:07.635 19:02:26 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.635 19:02:26 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:07.635 19:02:26 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.635 19:02:26 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.635 19:02:26 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.635 19:02:26 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:07.635 19:02:26 -- setup/hugepages.sh@83 -- # : 256
00:04:07.635 19:02:26 -- setup/hugepages.sh@84 -- # : 1
00:04:07.635 19:02:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.635 19:02:26 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:07.636 19:02:26 -- setup/hugepages.sh@83 -- # : 0
00:04:07.636 19:02:26 -- setup/hugepages.sh@84 -- # : 0
00:04:07.636 19:02:26 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:04:07.636 19:02:26 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:07.636 19:02:26 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:07.636 19:02:26 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:07.636 19:02:26 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:07.636 19:02:26 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:07.636 19:02:26 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.636 19:02:26 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:07.636 19:02:26 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.636 19:02:26 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.636 19:02:26 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.636 19:02:26 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:07.636 19:02:26 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:07.636 19:02:26 -- setup/hugepages.sh@78 -- # return 0
00:04:07.636 19:02:26 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:07.636 19:02:26 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:07.636 19:02:26 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:07.636 19:02:26 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:07.636 19:02:26 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:07.636 19:02:26 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:04:07.636 19:02:26 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:07.636 19:02:26 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:07.636 19:02:26 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:07.636 19:02:26 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:07.636 19:02:26 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:07.636 19:02:26 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:07.636 19:02:26 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:04:07.636 19:02:26 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:07.636 19:02:26 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:07.636 19:02:26 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:07.636 19:02:26 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:07.636 19:02:26 -- setup/hugepages.sh@78 -- # return 0
00:04:07.636 19:02:26 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:04:07.636 19:02:26 -- setup/hugepages.sh@187 -- # setup output
00:04:07.636 19:02:26 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:07.636 19:02:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:10.932 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:10.932 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
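custom_alloc, traced above, turns per-node budgets given in kB into 2048 kB page counts (1048576 kB on node0 becomes 512 pages, 2097152 kB on node1 becomes 1024) and joins the plan into the HUGENODE string that setup.sh consumes. A sketch of that arithmetic and string building; the helper names are illustrative, while the HUGENODE format is copied from the trace:

    #!/usr/bin/env bash
    # Derive nodes_hp[] page counts from kB budgets and serialize them into
    # the HUGENODE string shown in the trace before setup.sh is invoked.
    default_hugepages=2048                   # Hugepagesize in kB, per the log

    pages_for() { echo $(( $1 / default_hugepages )); }

    declare -a nodes_hp
    nodes_hp[0]=$(pages_for 1048576)         # 1 GiB on node0 -> 512 pages
    nodes_hp[1]=$(pages_for 2097152)         # 2 GiB on node1 -> 1024 pages

    parts=()
    total=0
    for node in "${!nodes_hp[@]}"; do
        parts+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( total += nodes_hp[node] ))
    done

    join() { local IFS=,; echo "$*"; }
    HUGENODE=$(join "${parts[@]}")

    echo "$HUGENODE"                         # nodes_hp[0]=512,nodes_hp[1]=1024
    echo "nr_hugepages=$total"               # 1536, matching the trace

setup.sh presumably writes each node's count under /sys/devices/system/node/nodeN/hugepages/, which is why the verification that follows sees nr_hugepages=1536.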
00:04:10.932 19:02:29 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:04:10.932 19:02:29 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:04:10.932 19:02:29 -- setup/hugepages.sh@89 -- # local node
00:04:10.932 19:02:29 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:10.932 19:02:29 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:10.932 19:02:29 -- setup/hugepages.sh@92 -- # local surp
00:04:10.932 19:02:29 -- setup/hugepages.sh@93 -- # local resv
00:04:10.932 19:02:29 -- setup/hugepages.sh@94 -- # local anon
00:04:10.932 19:02:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:10.932 19:02:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:10.932 19:02:29 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:10.932 19:02:29 -- setup/common.sh@18 -- # local node=
00:04:10.932 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:10.932 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.932 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.932 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.932 19:02:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.932 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.932 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.932 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:10.932 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:10.932 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 42021600 kB' 'MemAvailable: 45741756 kB' 'Buffers: 8940 kB' 'Cached: 11206152 kB' 'SwapCached: 0 kB' 'Active: 7972208 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554680 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448708 kB' 'Mapped: 167280 kB' 'Shmem: 7109228 kB' 'KReclaimable: 222316 kB' 'Slab: 909128 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686812 kB' 'KernelStack: 21840 kB' 'PageTables: 7520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 8774604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214480 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:10.932 19:02:29 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] [per-field match/continue xtrace elided; the loop walks every /proc/meminfo key above until AnonHugePages matches]
00:04:10.933 19:02:29 -- setup/common.sh@33 -- # echo 0
00:04:10.933 19:02:29 -- setup/common.sh@33 -- # return 0
00:04:10.933 19:02:29 -- setup/hugepages.sh@97 -- # anon=0
00:04:10.933 19:02:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:10.933 19:02:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:10.933 19:02:29 -- setup/common.sh@18 -- # local node=
00:04:10.933 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:10.933 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.933 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.933 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.933 19:02:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.933 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.933 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.933 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:10.933 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:10.933 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 42021496 kB' 'MemAvailable: 45741652 kB' 'Buffers: 8940 kB' 'Cached: 11206156 kB' 'SwapCached: 0 kB' 'Active: 7971900 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554372 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448392 kB' 'Mapped: 167172 kB' 'Shmem: 7109232 kB' 'KReclaimable: 222316 kB' 'Slab: 909092 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686776 kB' 'KernelStack: 21840 kB' 'PageTables: 7492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 8774616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:10.933 19:02:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] [per-field match/continue xtrace elided; the loop walks every /proc/meminfo key above until HugePages_Surp matches]
00:04:10.933 19:02:29 -- setup/common.sh@33 -- # echo 0
00:04:10.934 19:02:29 -- setup/common.sh@33 -- # return 0
00:04:10.934 19:02:29 -- setup/hugepages.sh@99 -- # surp=0
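Each get_meminfo call in this trace follows the same pattern: pick /proc/meminfo (or a per-node sysfs file when a node argument is given), mapfile the contents, strip any "Node N " prefix, then split each line on ': ' and echo the value once the requested key matches. A condensed re-implementation of that pattern; the function name and the stripping/splitting steps mirror the setup/common.sh trace, but the body is a simplified sketch, not the original:

    #!/usr/bin/env bash
    shopt -s extglob   # required for the +([0-9]) pattern below
    get_meminfo() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local -a mem
        local line var val _
        # Per-node lookups read the node's own meminfo file, as in the trace.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # e.g. var=HugePages_Total val=1536
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done
        echo 0
    }
    get_meminfo HugePages_Total     # -> 1536 in the dumps above
    get_meminfo HugePages_Surp 0    # node-0 surplus, read via sysfs

This is why every lookup in the log is preceded by a full meminfo dump: the function scans the whole snapshot and the xtrace records one match attempt per key.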
00:04:10.934 19:02:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:10.934 19:02:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:10.934 19:02:29 -- setup/common.sh@18 -- # local node=
00:04:10.934 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:10.934 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.934 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.934 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.934 19:02:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.934 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.934 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.934 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:10.934 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:10.935 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 42022548 kB' 'MemAvailable: 45742704 kB' 'Buffers: 8940 kB' 'Cached: 11206168 kB' 'SwapCached: 0 kB' 'Active: 7971912 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554384 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448380 kB' 'Mapped: 167172 kB' 'Shmem: 7109244 kB' 'KReclaimable: 222316 kB' 'Slab: 909004 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686688 kB' 'KernelStack: 21824 kB' 'PageTables: 7432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 8774632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:10.935 19:02:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] [per-field match/continue xtrace elided; the loop walks every /proc/meminfo key above until HugePages_Rsvd matches]
00:04:10.936 19:02:29 -- setup/common.sh@33 -- # echo 0
00:04:10.936 19:02:29 -- setup/common.sh@33 -- # return 0
00:04:10.936 19:02:29 -- setup/hugepages.sh@100 -- # resv=0
00:04:10.936 19:02:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:10.936 nr_hugepages=1536
00:04:10.936 19:02:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:10.936 resv_hugepages=0
00:04:10.936 19:02:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:10.936 surplus_hugepages=0
00:04:10.936 19:02:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:10.936 anon_hugepages=0
00:04:10.936 19:02:29 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:10.936 19:02:29 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
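The echoes above record anon=0, surp=0 and resv=0, and the two (( ... )) checks assert that the 1536 pages reported by the kernel equal the requested nr_hugepages plus surplus plus reserved. Restated as a standalone check, reusing the hypothetical get_meminfo sketch above, with the values taken from this log:

    nr_hugepages=1536 surp=0 resv=0
    total=$(get_meminfo HugePages_Total)   # 1536 in the dumps above
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage pool consistent: $total pages"
    else
        echo "hugepage mismatch: total=$total expected=$((nr_hugepages + surp + resv))" >&2
        exit 1
    fi

With surplus and reserved both zero, the invariant reduces to HugePages_Total == nr_hugepages, which is exactly the second check in the trace.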
00:04:10.936 19:02:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:10.936 19:02:29 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:10.936 19:02:29 -- setup/common.sh@18 -- # local node=
00:04:10.936 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:10.936 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:10.936 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:10.936 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:10.936 19:02:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:10.936 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:10.936 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:10.936 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:10.936 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:10.936 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 42022804 kB' 'MemAvailable: 45742960 kB' 'Buffers: 8940 kB' 'Cached: 11206184 kB' 'SwapCached: 0 kB' 'Active: 7971932 kB' 'Inactive: 3688336 kB' 'Active(anon): 7554404 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 448384 kB' 'Mapped: 167172 kB' 'Shmem: 7109260 kB' 'KReclaimable: 222316 kB' 'Slab: 909004 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686688 kB' 'KernelStack: 21824 kB' 'PageTables: 7432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 8774648 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:10.936 19:02:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] [per-field match/continue xtrace elided; the loop walks every /proc/meminfo key above until HugePages_Total matches]
00:04:11.200 19:02:29 -- setup/common.sh@33 -- # echo 1536
00:04:11.200 19:02:29 -- setup/common.sh@33 -- # return 0
00:04:11.200 19:02:29 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:11.200 19:02:29 -- setup/hugepages.sh@112 -- # get_nodes
00:04:11.200 19:02:29 -- setup/hugepages.sh@27 -- # local node
00:04:11.200 19:02:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.200 19:02:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:11.200 19:02:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.200 19:02:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:11.200 19:02:29 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:11.200 19:02:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:11.200 19:02:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:11.200 19:02:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:11.200 19:02:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:11.200 19:02:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:11.200 19:02:29 -- setup/common.sh@18 -- # local node=0
00:04:11.200 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:11.200 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.200 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.200 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:11.200 19:02:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:11.200 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.200 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:11.200 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19405976 kB' 'MemUsed: 13179392 kB' 'SwapCached: 0 kB' 'Active: 6616952 kB' 'Inactive: 3531488 kB' 'Active(anon): 6339100 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861416 kB' 'Mapped: 149320 kB' 'AnonPages: 290136 kB' 'Shmem: 6052076 kB' 'KernelStack: 12968 kB' 'PageTables: 5180 kB' 'SecPageTables: 0 kB'
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 485028 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338540 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 
-- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.200 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.200 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- 
setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # continue 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.201 19:02:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.201 19:02:29 -- setup/common.sh@33 -- # echo 0 00:04:11.201 19:02:29 -- 
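A minimal bash sketch of the get_meminfo helper that produces the trace above, inferred from the xtrace itself (the names match the trace, but this is a reconstruction, not the verbatim setup/common.sh source):

    #!/usr/bin/env bash
    # Inferred sketch of setup/common.sh's get_meminfo, based on the xtrace.
    shopt -s extglob
    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _ mem_f=/proc/meminfo
        local -a mem
        # A per-node query reads that node's own meminfo when the sysfs file exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")      # strip the leading "Node N " prefix
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue  # skip keys until the requested one
            echo "$val"                       # numeric value; the unit lands in $_
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    # Example: get_meminfo HugePages_Surp 0   ->   0, as echoed in the trace

This linear key scan is what generates the long runs of "[[ key == pattern ]] / continue" entries repeated throughout this log: one pair per meminfo field until the requested key is reached.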
00:04:11.201 19:02:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:11.201 19:02:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:11.201 19:02:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:11.201 19:02:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:11.201 19:02:29 -- setup/common.sh@18 -- # local node=1
00:04:11.201 19:02:29 -- setup/common.sh@19 -- # local var val
00:04:11.201 19:02:29 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.201 19:02:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.201 19:02:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:11.201 19:02:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:11.201 19:02:29 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.201 19:02:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.201 19:02:29 -- setup/common.sh@31 -- # IFS=': '
00:04:11.201 19:02:29 -- setup/common.sh@31 -- # read -r var val _
00:04:11.201 19:02:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698404 kB' 'MemFree: 22617196 kB' 'MemUsed: 5081208 kB' 'SwapCached: 0 kB' 'Active: 1355024 kB' 'Inactive: 156848 kB' 'Active(anon): 1215348 kB' 'Inactive(anon): 0 kB' 'Active(file): 139676 kB' 'Inactive(file): 156848 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1353736 kB' 'Mapped: 17852 kB' 'AnonPages: 158256 kB' 'Shmem: 1057212 kB' 'KernelStack: 8856 kB' 'PageTables: 2252 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 75828 kB' 'Slab: 423976 kB' 'SReclaimable: 75828 kB' 'SUnreclaim: 348148 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... setup/common.sh@31-32 xtrace elided: each node1 meminfo key (MemTotal through HugePages_Free) is tested against HugePages_Surp and skipped with "continue" ...]
00:04:11.202 19:02:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:11.202 19:02:29 -- setup/common.sh@33 -- # echo 0
00:04:11.202 19:02:29 -- setup/common.sh@33 -- # return 0
00:04:11.202 19:02:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:11.202 19:02:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:11.202 19:02:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:11.202 19:02:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:11.202 19:02:29 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:11.202 19:02:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:11.202 19:02:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:11.202 19:02:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:11.202 19:02:29 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:04:11.202 19:02:29 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
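The per-node comparison just traced can be summarized with this sketch (it reuses the get_meminfo sketch above; the resv term is dropped since it is 0 in this run):

    # Per-node check reconstructed from the trace: surplus pages are folded
    # into each node's expected count, then the kernel's per-node totals
    # (nodes_sys) must line up with the test's expectations (nodes_test).
    nodes_test=(512 1024)   # pages the test allocated per node
    nodes_sys=(512 1024)    # pages the kernel reports per node (from get_nodes)
    for node in "${!nodes_test[@]}"; do
        surp=$(get_meminfo HugePages_Surp "$node")   # 0 for both nodes here
        (( nodes_test[node] += surp ))
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done
    # Final check: the joined lists must match ("512 1024" == "512 1024").
    [[ ${nodes_sys[*]} == "${nodes_test[*]}" ]]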
00:04:11.202 real 0m3.549s
00:04:11.202 user 0m1.327s
00:04:11.202 sys 0m2.260s
00:04:11.202 19:02:29 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:11.202 19:02:29 -- common/autotest_common.sh@10 -- # set +x
00:04:11.202 ************************************
00:04:11.202 END TEST custom_alloc
00:04:11.202 ************************************
00:04:11.202 19:02:29 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:11.202 19:02:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:11.202 19:02:29 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:11.202 19:02:29 -- common/autotest_common.sh@10 -- # set +x
00:04:11.202 ************************************
00:04:11.202 START TEST no_shrink_alloc
00:04:11.202 ************************************
00:04:11.202 19:02:29 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:11.202 19:02:29 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:11.202 19:02:29 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:11.202 19:02:29 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:11.202 19:02:29 -- setup/hugepages.sh@51 -- # shift
00:04:11.202 19:02:29 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:11.202 19:02:29 -- setup/hugepages.sh@52 -- # local node_ids
00:04:11.202 19:02:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:11.202 19:02:29 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:11.202 19:02:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:11.202 19:02:29 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:11.202 19:02:29 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:11.202 19:02:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:11.202 19:02:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:11.202 19:02:29 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:11.202 19:02:29 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:11.202 19:02:29 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:11.202 19:02:29 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:11.202 19:02:29 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:11.202 19:02:29 -- setup/hugepages.sh@73 -- # return 0
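A hedged reconstruction of the arithmetic behind that trace; the kB interpretation of size is inferred from the 'Hugepagesize: 2048 kB' / 'Hugetlb: 2097152 kB' values reported later in this log:

    # Inferred arithmetic: size and default_hugepages are both in kB, so
    # 2097152 kB / 2048 kB per page = 1024 pages, all pinned to user node 0.
    size=2097152            # kB requested by get_test_nr_hugepages
    default_hugepages=2048  # kB per page, i.e. Hugepagesize
    nr_hugepages=$(( size / default_hugepages ))   # = 1024
    user_nodes=(0)
    nodes_test=()
    for _no_nodes in "${user_nodes[@]}"; do
        nodes_test[_no_nodes]=$nr_hugepages        # nodes_test[0]=1024
    done
    echo "nr_hugepages=$nr_hugepages on node(s) ${!nodes_test[*]}"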
00:04:11.202 19:02:29 -- setup/hugepages.sh@198 -- # setup output
00:04:11.202 19:02:29 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:11.202 19:02:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:14.497 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:14.497 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:14.762 19:02:33 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:14.762 19:02:33 -- setup/hugepages.sh@89 -- # local node
00:04:14.762 19:02:33 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:14.762 19:02:33 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:14.762 19:02:33 -- setup/hugepages.sh@92 -- # local surp
00:04:14.762 19:02:33 -- setup/hugepages.sh@93 -- # local resv
00:04:14.762 19:02:33 -- setup/hugepages.sh@94 -- # local anon
00:04:14.762 19:02:33 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:14.762 19:02:33 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:14.762 19:02:33 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:14.762 19:02:33 -- setup/common.sh@18 -- # local node=
00:04:14.762 19:02:33 -- setup/common.sh@19 -- # local var val
00:04:14.762 19:02:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.762 19:02:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.762 19:02:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.762 19:02:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.762 19:02:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.762 19:02:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.762 19:02:33 -- setup/common.sh@31 -- # IFS=': '
00:04:14.762 19:02:33 -- setup/common.sh@31 -- # read -r var val _
00:04:14.762 19:02:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43089524 kB' 'MemAvailable: 46809680 kB' 'Buffers: 8940 kB' 'Cached: 11206280 kB' 'SwapCached: 0 kB' 'Active: 7974160 kB' 'Inactive: 3688336 kB' 'Active(anon): 7556632 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 450276 kB' 'Mapped: 167288 kB' 'Shmem: 7109356 kB' 'KReclaimable: 222316 kB' 'Slab: 908380 kB' 'SReclaimable: 222316 kB' 'SUnreclaim: 686064 kB' 'KernelStack: 21888 kB' 'PageTables: 7596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8779800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31-32 xtrace elided: every key (MemTotal through HardwareCorrupted) is tested against AnonHugePages and skipped with "continue" ...]
00:04:14.763 19:02:33 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:14.763 19:02:33 -- setup/common.sh@33 -- # echo 0
00:04:14.763 19:02:33 -- setup/common.sh@33 -- # return 0
00:04:14.763 19:02:33 -- setup/hugepages.sh@97 -- # anon=0
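The anon step above can be read as the following sketch (assuming the mode file is the standard /sys/kernel/mm/transparent_hugepage/enabled; get_meminfo is the sketch given earlier):

    # The bracketed token in the THP "enabled" file names the active mode;
    # AnonHugePages is only counted when THP is not pinned to [never]. In
    # this run the file read "always [madvise] never", so the branch was
    # taken and get_meminfo AnonHugePages returned 0.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    else
        anon=0
    fi
    echo "anon=$anon"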
00:04:14.763 19:02:33 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:14.763 19:02:33 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:14.763 19:02:33 -- setup/common.sh@18 -- # local node=
00:04:14.763 19:02:33 -- setup/common.sh@19 -- # local var val
00:04:14.763 19:02:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.763 19:02:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.763 19:02:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.763 19:02:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.763 19:02:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.763 19:02:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.763 19:02:33 -- setup/common.sh@31 -- # IFS=': '
00:04:14.763 19:02:33 -- setup/common.sh@31 -- # read -r var val _
00:04:14.763 19:02:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43093020 kB' 'MemAvailable: 46813160 kB' 'Buffers: 8940 kB' 'Cached: 11206284 kB' 'SwapCached: 0 kB' 'Active: 7972924 kB' 'Inactive: 3688336 kB' 'Active(anon): 7555396 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 449480 kB' 'Mapped: 167156 kB' 'Shmem: 7109360 kB' 'KReclaimable: 222284 kB' 'Slab: 908200 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685916 kB' 'KernelStack: 21920 kB' 'PageTables: 7320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8778296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[... setup/common.sh@31-32 xtrace elided: every key (MemTotal through HugePages_Rsvd) is tested against HugePages_Surp and skipped with "continue" ...]
00:04:14.764 19:02:33 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:14.764 19:02:33 -- setup/common.sh@33 -- # echo 0
00:04:14.764 19:02:33 -- setup/common.sh@33 -- # return 0
00:04:14.764 19:02:33 -- setup/hugepages.sh@99 -- # surp=0
00:04:14.764 19:02:33 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:14.764 19:02:33 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:14.764 19:02:33 -- setup/common.sh@18 -- # local node=
00:04:14.764 19:02:33 -- setup/common.sh@19 -- # local var val
00:04:14.764 19:02:33 -- setup/common.sh@20 -- # local mem_f mem
00:04:14.764 19:02:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:14.764 19:02:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:14.764 19:02:33 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:14.764 19:02:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.764 19:02:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:14.764 19:02:33 -- setup/common.sh@31 -- # IFS=': '
00:04:14.764 19:02:33 -- setup/common.sh@31 -- # read -r var val _
00:04:14.765 19:02:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43097944 kB' 'MemAvailable: 46818084 kB' 'Buffers: 8940 kB' 'Cached: 11206296 kB' 'SwapCached: 0 kB' 'Active: 7974752 kB' 'Inactive: 3688336 kB' 'Active(anon): 7557224 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 451244 kB' 'Mapped: 167676 kB' 'Shmem: 7109372 kB' 'KReclaimable: 222284 kB' 'Slab: 908200 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685916 kB' 'KernelStack: 22016 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8779560 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 214512 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r 
var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ VmallocUsed == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.765 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.765 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 
19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:14.766 19:02:33 -- setup/common.sh@33 -- # echo 0 00:04:14.766 19:02:33 -- setup/common.sh@33 -- # return 0 00:04:14.766 19:02:33 -- setup/hugepages.sh@100 -- # resv=0 00:04:14.766 19:02:33 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:14.766 nr_hugepages=1024 00:04:14.766 19:02:33 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:14.766 resv_hugepages=0 00:04:14.766 19:02:33 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:14.766 surplus_hugepages=0 00:04:14.766 19:02:33 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:14.766 anon_hugepages=0 00:04:14.766 19:02:33 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:14.766 19:02:33 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:14.766 19:02:33 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:14.766 19:02:33 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:14.766 19:02:33 -- setup/common.sh@18 -- # local node= 00:04:14.766 19:02:33 -- setup/common.sh@19 -- # local var val 00:04:14.766 19:02:33 -- setup/common.sh@20 -- # local mem_f mem 00:04:14.766 19:02:33 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:14.766 19:02:33 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:14.766 19:02:33 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:14.766 19:02:33 -- setup/common.sh@28 -- # mapfile -t mem 00:04:14.766 19:02:33 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # IFS=': ' 00:04:14.766 19:02:33 -- setup/common.sh@31 -- # read -r var val _ 00:04:14.766 19:02:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43095092 kB' 'MemAvailable: 46815232 kB' 'Buffers: 8940 kB' 'Cached: 11206316 kB' 'SwapCached: 0 kB' 'Active: 7978792 kB' 'Inactive: 3688336 kB' 'Active(anon): 7561264 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 455280 kB' 'Mapped: 167676 kB' 'Shmem: 7109392 kB' 'KReclaimable: 222284 kB' 'Slab: 908136 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685852 kB' 'KernelStack: 22112 kB' 'PageTables: 8056 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8785596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214532 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:14.766 19:02:33 -- setup/common.sh@32 -- # continue 
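The loop traced above is a plain key/value scan of /proc/meminfo: split each "Key: value kB" line on ': ', compare the key, and stop on a match. A minimal standalone sketch of that pattern follows; the function name get_meminfo_sketch and the while-read loop are illustrative only, not the actual setup/common.sh implementation (which mapfiles a snapshot first, as the trace shows).

    # Sketch of the scan pattern, assuming plain bash.
    # get_meminfo_sketch is a hypothetical name, not part of setup/common.sh.
    get_meminfo_sketch() {
        local get=$1 var val _
        # IFS=': ' splits "HugePages_Rsvd:       0" into key and value.
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < /proc/meminfo
        return 1
    }

    get_meminfo_sketch HugePages_Rsvd   # prints 0 on the host traced above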
00:04:14.766 19:02:33 -- setup/common.sh@32 -- # (keys scanned until HugePages_Total matches)
00:04:14.767 19:02:33 -- setup/common.sh@33 -- # echo 1024
00:04:14.767 19:02:33 -- setup/common.sh@33 -- # return 0
00:04:14.767 19:02:33 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:14.767 19:02:33 -- setup/hugepages.sh@112 -- # get_nodes
00:04:14.767 19:02:33 -- setup/hugepages.sh@27 -- # local node
00:04:14.767 19:02:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:14.767 19:02:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:14.767 19:02:33 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
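The accounting check at hugepages.sh@107-110 reduces to one arithmetic identity over the three sampled counters: the total reported by the kernel must equal the requested count plus surplus plus reserved pages. Restated standalone with the values read out of the trace above:

    # Consistency check over the sampled hugepage counters (values from the log).
    nr_hugepages=1024
    surp=0    # HugePages_Surp from /proc/meminfo
    resv=0    # HugePages_Rsvd from /proc/meminfo
    if (( 1024 == nr_hugepages + surp + resv )) && (( 1024 == nr_hugepages )); then
        echo "hugepage accounting is consistent"
    fi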
00:04:14.767 19:02:33 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:14.767 19:02:33 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:14.767 19:02:33 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:14.767 19:02:33 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:14.767 19:02:33 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:14.767 19:02:33 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:14.768 19:02:33 -- setup/common.sh@18 -- # local node=0
00:04:14.768 19:02:33 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:14.768 19:02:33 -- setup/common.sh@28 -- # mapfile -t mem
00:04:14.768 19:02:33 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18363804 kB' 'MemUsed: 14221564 kB' ... 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' (remaining node0 snapshot keys elided)
00:04:14.768 19:02:33 -- setup/common.sh@32 -- # (node0 meminfo keys scanned until HugePages_Surp matches)
00:04:15.028 19:02:33 -- setup/common.sh@33 -- # echo 0
00:04:15.028 19:02:33 -- setup/common.sh@33 -- # return 0
00:04:15.028 19:02:33 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:15.028 19:02:33 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:15.028 19:02:33 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:15.028 19:02:33 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:15.028 19:02:33 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:15.028 node0=1024 expecting 1024
00:04:15.028 19:02:33 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:15.028 19:02:33 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:15.028 19:02:33 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:15.028 19:02:33 -- setup/hugepages.sh@202 -- # setup output
00:04:15.028 19:02:33 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:15.028 19:02:33 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:18.319 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:18.319 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:18.319 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:18.319 19:02:36 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:18.319 19:02:36 -- setup/hugepages.sh@89 -- # local node
00:04:18.319 19:02:36 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:18.319 19:02:36 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:18.319 19:02:36 -- setup/hugepages.sh@92 -- # local surp
00:04:18.319 19:02:36 -- setup/hugepages.sh@93 -- # local resv
00:04:18.319 19:02:36 -- setup/hugepages.sh@94 -- # local anon
00:04:18.319 19:02:36 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:18.319 19:02:36 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:18.319 19:02:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.319 19:02:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43115464 kB' 'MemAvailable: 46835604 kB' ... 'AnonPages: 449896 kB' ... 'AnonHugePages: 0 kB' ... 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' (remaining snapshot keys elided)
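The per-node query above reads /sys/devices/system/node/node0/meminfo instead of /proc/meminfo; node meminfo lines carry a "Node <n> " prefix that the +([0-9]) extglob pattern strips before the same key scan runs. A minimal sketch of that variant, assuming bash with extglob and a node0 directory present (the loop body is illustrative, not the setup/common.sh code verbatim):

    # Per-node scan: strip the "Node <n> " prefix, then match keys as before.
    shopt -s extglob   # required for the +([0-9]) pattern below
    node=0
    mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Surp ]] && echo "node${node} HugePages_Surp: $val"
    done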
00:04:18.320 19:02:36 -- setup/common.sh@32 -- # (remaining /proc/meminfo keys scanned for AnonHugePages; trace cuts off mid-scan)
setup/common.sh@32 -- # continue 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:18.320 19:02:36 -- setup/common.sh@33 -- # echo 0 00:04:18.320 19:02:36 -- setup/common.sh@33 -- # return 0 00:04:18.320 19:02:36 -- setup/hugepages.sh@97 -- # anon=0 00:04:18.320 19:02:36 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:18.320 19:02:36 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:18.320 19:02:36 -- setup/common.sh@18 -- # local node= 00:04:18.320 19:02:36 -- setup/common.sh@19 -- # local var val 00:04:18.320 19:02:36 -- setup/common.sh@20 -- # local mem_f mem 00:04:18.320 19:02:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:18.320 19:02:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:18.320 19:02:36 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:18.320 19:02:36 -- setup/common.sh@28 -- # mapfile -t mem 00:04:18.320 19:02:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.320 19:02:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43116612 kB' 'MemAvailable: 46836752 kB' 'Buffers: 8940 kB' 'Cached: 11206420 kB' 'SwapCached: 0 kB' 'Active: 7973844 kB' 'Inactive: 3688336 kB' 'Active(anon): 7556316 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 450180 kB' 'Mapped: 167164 kB' 'Shmem: 7109496 kB' 'KReclaimable: 222284 kB' 'Slab: 908180 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685896 kB' 'KernelStack: 21840 kB' 'PageTables: 7436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8776048 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB' 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.320 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.320 19:02:36 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.320 19:02:36 
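[annotation] The loop traced above is get_meminfo() in setup/common.sh: mapfile snapshots /proc/meminfo into an array, each entry is split on ': ' by read -r var val _, and every field that is not the requested key falls through to the continue at common.sh@32 until the key matches and its value is echoed. A minimal standalone sketch of the same lookup pattern (illustrative only, not the SPDK source itself):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo lookup pattern seen in the trace above:
    # snapshot a meminfo-style file, scan it for one field, print its value.
    get_meminfo_sketch() {
        local get=$1 var val _ line
        local -a mem
        mapfile -t mem < /proc/meminfo              # one entry per field, e.g. "HugePages_Surp:  0"
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"  # split "Key: value kB" on ':' and spaces
            [[ $var == "$get" ]] || continue        # the common.sh@32 continue in the trace
            echo "$val"                             # match: print the value (common.sh@33 echo)
            return 0
        done
        echo 0                                      # absent field reads as 0
    }

    get_meminfo_sketch HugePages_Surp               # prints 0 on this runner

[annotation] The scan is linear, so under set -x each call replays the @31/@32 pair once per /proc/meminfo field; most of this section is that repetition.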
[xtrace condensed: the same @31-32 loop compares every /proc/meminfo field from MemTotal through HugePages_Rsvd against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and continues past each one]
00:04:18.321 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:18.321 19:02:36 -- setup/common.sh@33 -- # echo 0
00:04:18.321 19:02:36 -- setup/common.sh@33 -- # return 0
00:04:18.321 19:02:36 -- setup/hugepages.sh@99 -- # surp=0
00:04:18.321 19:02:36 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:18.321 19:02:36 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:18.321 19:02:36 -- setup/common.sh@18 -- # local node=
00:04:18.321 19:02:36 -- setup/common.sh@19 -- # local var val
00:04:18.321 19:02:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.321 19:02:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.321 19:02:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
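[annotation] The backslash-riddled right-hand sides such as \H\u\g\e\P\a\g\e\s\_\S\u\r\p are not in the script source. They are how set -x renders a [[ ... == ... ]] operand that is matched literally (which suggests the script compares against "$get" quoted): xtrace escapes every character to show that no globbing takes place. A small reproduction (illustrative):

    # Reproduce the per-character escaping seen in this trace: with a quoted
    # right-hand side the match is literal, and xtrace marks that by escaping
    # each character of the expanded pattern.
    set -x
    get=HugePages_Surp
    [[ MemTotal == "$get" ]] || echo "no match"   # traced as: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
    set +x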
00:04:18.321 19:02:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.321 19:02:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.321 19:02:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.321 19:02:36 -- setup/common.sh@31 -- # IFS=': '
00:04:18.321 19:02:36 -- setup/common.sh@31 -- # read -r var val _
00:04:18.321 19:02:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43117860 kB' 'MemAvailable: 46838000 kB' 'Buffers: 8940 kB' 'Cached: 11206432 kB' 'SwapCached: 0 kB' 'Active: 7973860 kB' 'Inactive: 3688336 kB' 'Active(anon): 7556332 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 450180 kB' 'Mapped: 167164 kB' 'Shmem: 7109508 kB' 'KReclaimable: 222284 kB' 'Slab: 908180 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685896 kB' 'KernelStack: 21840 kB' 'PageTables: 7436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8776060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
[xtrace condensed: the @31-32 loop compares every field from MemTotal through HugePages_Free against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and continues past each one]
00:04:18.583 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:18.583 19:02:36 -- setup/common.sh@33 -- # echo 0
00:04:18.583 19:02:36 -- setup/common.sh@33 -- # return 0
00:04:18.583 19:02:36 -- setup/hugepages.sh@100 -- # resv=0
00:04:18.583 19:02:36 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:18.583 nr_hugepages=1024
00:04:18.583 19:02:36 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:18.583 resv_hugepages=0
00:04:18.583 19:02:36 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:18.583 surplus_hugepages=0
00:04:18.583 19:02:36 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:18.583 anon_hugepages=0
00:04:18.583 19:02:36 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:18.583 19:02:36 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:18.583 19:02:36 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:18.583 19:02:36 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:18.583 19:02:36 -- setup/common.sh@18 -- # local node=
00:04:18.583 19:02:36 -- setup/common.sh@19 -- # local var val
00:04:18.583 19:02:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.583 19:02:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.583 19:02:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:18.583 19:02:36 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:18.583 19:02:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.583 19:02:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.583 19:02:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283772 kB' 'MemFree: 43118172 kB' 'MemAvailable: 46838312 kB' 'Buffers: 8940 kB' 'Cached: 11206460 kB' 'SwapCached: 0 kB' 'Active: 7973524 kB' 'Inactive: 3688336 kB' 'Active(anon): 7555996 kB' 'Inactive(anon): 0 kB' 'Active(file): 417528 kB' 'Inactive(file): 3688336 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 449792 kB' 'Mapped: 167164 kB' 'Shmem: 7109536 kB' 'KReclaimable: 222284 kB' 'Slab: 908180 kB' 'SReclaimable: 222284 kB' 'SUnreclaim: 685896 kB' 'KernelStack: 21824 kB' 'PageTables: 7388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 8776076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 425332 kB' 'DirectMap2M: 9746432 kB' 'DirectMap1G: 59768832 kB'
00:04:18.583 19:02:36 -- setup/common.sh@31 -- # IFS=': '
00:04:18.583 19:02:36 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the @31-32 loop compares every field from MemTotal through Unaccepted against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and continues past each one]
00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:18.584 19:02:36 -- setup/common.sh@33 -- # echo 1024
00:04:18.584 19:02:36 -- setup/common.sh@33 -- # return 0
00:04:18.584 19:02:36 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:18.584 19:02:36 -- setup/hugepages.sh@112 -- # get_nodes
00:04:18.584 19:02:36 -- setup/hugepages.sh@27 -- # local node
00:04:18.584 19:02:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:18.584 19:02:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:18.584 19:02:36 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:18.584 19:02:36 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:18.584 19:02:36 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:18.584 19:02:36 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:18.584 19:02:36 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:18.584 19:02:36 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:18.584 19:02:36 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:18.584 19:02:36 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:18.584 19:02:36 -- setup/common.sh@18 -- # local node=0
00:04:18.584 19:02:36 -- setup/common.sh@19 -- # local var val
00:04:18.584 19:02:36 -- setup/common.sh@20 -- # local mem_f mem
00:04:18.584 19:02:36 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:18.584 19:02:36 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:18.584 19:02:36 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:18.584 19:02:36 -- setup/common.sh@28 -- # mapfile -t mem
00:04:18.584 19:02:36 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': '
00:04:18.584 19:02:36 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18380312 kB' 'MemUsed: 14205056 kB' 'SwapCached: 0 kB' 'Active: 6617316 kB' 'Inactive: 3531488 kB' 'Active(anon): 6339464 kB' 'Inactive(anon): 0 kB' 'Active(file): 277852 kB' 'Inactive(file): 3531488 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9861440 kB' 'Mapped: 149312 kB' 'AnonPages: 290572 kB' 'Shmem: 6052100 kB' 'KernelStack: 13000 kB' 'PageTables: 5224 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 146488 kB' 'Slab: 484516 kB' 'SReclaimable: 146488 kB' 'SUnreclaim: 338028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _
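[annotation] For the per-node pass (get_meminfo HugePages_Surp 0), common.sh@23/@24 swap mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo. Lines in the node file carry a "Node 0 " prefix, and the @29 expansion mem=("${mem[@]#Node +([0-9]) }") strips it with an extglob pattern so the same scan loop works on both files. A standalone version of that stripping (illustrative only):

    # Per-node lookup as traced above: node meminfo lines look like
    # "Node 0 HugePages_Surp: 0", so the "Node <n> " prefix is stripped first.
    shopt -s extglob                                 # +([0-9]) needs extglob
    node=0
    mapfile -t mem < "/sys/devices/system/node/node${node}/meminfo"
    mem=("${mem[@]#Node +([0-9]) }")                 # same stripping as common.sh@29
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == HugePages_Surp ]] || continue
        echo "node${node} HugePages_Surp: ${val}"
        break
    done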
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.584 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.584 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 
00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- 
setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # continue 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # IFS=': ' 00:04:18.585 19:02:36 -- setup/common.sh@31 -- # read -r var val _ 00:04:18.585 19:02:36 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:18.585 19:02:36 -- setup/common.sh@33 -- # echo 0 00:04:18.585 19:02:36 -- setup/common.sh@33 -- # return 0 00:04:18.585 19:02:36 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:18.585 19:02:36 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:18.585 19:02:36 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:18.585 19:02:36 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:18.585 19:02:36 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:18.585 node0=1024 expecting 1024 00:04:18.585 19:02:36 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:18.585 00:04:18.585 real 0m7.343s 00:04:18.585 user 0m2.758s 00:04:18.585 sys 0m4.724s 
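The loop unrolled above is the script's get_meminfo pattern: scan a meminfo file with IFS=': ' and print the value of a single field. A minimal standalone sketch of that pattern, assuming plain bash plus sed (the traced helper instead uses mapfile and an extglob strip; only the function name is taken from the trace, the body here is illustrative):

#!/usr/bin/env bash
# Sketch only: look up one field in /proc/meminfo or a per-node meminfo.
get_meminfo() {
    local key=$1 node=${2:-} mem_f=/proc/meminfo var val _
    # Per-NUMA-node counters live in sysfs when a node index is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Node files prefix each line with "Node <n> "; strip that, then split
    # on ':' and whitespace exactly as the traced read loop does.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}
get_meminfo HugePages_Total     # printed 1024 in the run above
get_meminfo HugePages_Surp 0    # printed 0 for node 0, matching the trace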
00:04:18.585 19:02:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:18.585 19:02:36 -- common/autotest_common.sh@10 -- # set +x 00:04:18.585 ************************************ 00:04:18.585 END TEST no_shrink_alloc 00:04:18.585 ************************************ 00:04:18.585 19:02:37 -- setup/hugepages.sh@217 -- # clear_hp 00:04:18.585 19:02:37 -- setup/hugepages.sh@37 -- # local node hp 00:04:18.585 19:02:37 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:18.585 19:02:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:18.585 19:02:37 -- setup/hugepages.sh@41 -- # echo 0 00:04:18.585 19:02:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:18.585 19:02:37 -- setup/hugepages.sh@41 -- # echo 0 00:04:18.585 19:02:37 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:18.585 19:02:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:18.585 19:02:37 -- setup/hugepages.sh@41 -- # echo 0 00:04:18.585 19:02:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:18.585 19:02:37 -- setup/hugepages.sh@41 -- # echo 0 00:04:18.585 19:02:37 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:18.585 19:02:37 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:18.585 00:04:18.585 real 0m27.172s 00:04:18.585 user 0m9.659s 00:04:18.585 sys 0m16.345s 00:04:18.585 19:02:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:18.585 19:02:37 -- common/autotest_common.sh@10 -- # set +x 00:04:18.585 ************************************ 00:04:18.585 END TEST hugepages 00:04:18.585 ************************************ 00:04:18.585 19:02:37 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:18.585 19:02:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:18.585 19:02:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:18.585 19:02:37 -- common/autotest_common.sh@10 -- # set +x 00:04:18.585 ************************************ 00:04:18.585 START TEST driver 00:04:18.585 ************************************ 00:04:18.585 19:02:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:18.846 * Looking for test storage... 
00:04:18.846 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:18.846 19:02:37 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:18.846 19:02:37 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:18.846 19:02:37 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:18.846 19:02:37 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:18.846 19:02:37 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:18.846 19:02:37 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:18.846 19:02:37 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:18.846 19:02:37 -- scripts/common.sh@335 -- # IFS=.-: 00:04:18.846 19:02:37 -- scripts/common.sh@335 -- # read -ra ver1 00:04:18.846 19:02:37 -- scripts/common.sh@336 -- # IFS=.-: 00:04:18.846 19:02:37 -- scripts/common.sh@336 -- # read -ra ver2 00:04:18.846 19:02:37 -- scripts/common.sh@337 -- # local 'op=<' 00:04:18.846 19:02:37 -- scripts/common.sh@339 -- # ver1_l=2 00:04:18.846 19:02:37 -- scripts/common.sh@340 -- # ver2_l=1 00:04:18.846 19:02:37 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:18.846 19:02:37 -- scripts/common.sh@343 -- # case "$op" in 00:04:18.846 19:02:37 -- scripts/common.sh@344 -- # : 1 00:04:18.846 19:02:37 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:18.846 19:02:37 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:18.846 19:02:37 -- scripts/common.sh@364 -- # decimal 1 00:04:18.846 19:02:37 -- scripts/common.sh@352 -- # local d=1 00:04:18.846 19:02:37 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:18.846 19:02:37 -- scripts/common.sh@354 -- # echo 1 00:04:18.846 19:02:37 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:18.846 19:02:37 -- scripts/common.sh@365 -- # decimal 2 00:04:18.846 19:02:37 -- scripts/common.sh@352 -- # local d=2 00:04:18.846 19:02:37 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:18.846 19:02:37 -- scripts/common.sh@354 -- # echo 2 00:04:18.846 19:02:37 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:18.846 19:02:37 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:18.846 19:02:37 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:18.846 19:02:37 -- scripts/common.sh@367 -- # return 0 00:04:18.846 19:02:37 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:18.846 19:02:37 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:18.846 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.846 --rc genhtml_branch_coverage=1 00:04:18.846 --rc genhtml_function_coverage=1 00:04:18.846 --rc genhtml_legend=1 00:04:18.846 --rc geninfo_all_blocks=1 00:04:18.846 --rc geninfo_unexecuted_blocks=1 00:04:18.846 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.846 ' 00:04:18.846 19:02:37 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:18.846 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.846 --rc genhtml_branch_coverage=1 00:04:18.846 --rc genhtml_function_coverage=1 00:04:18.846 --rc genhtml_legend=1 00:04:18.846 --rc geninfo_all_blocks=1 00:04:18.846 --rc geninfo_unexecuted_blocks=1 00:04:18.846 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.846 ' 00:04:18.846 19:02:37 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:18.846 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.846 --rc genhtml_branch_coverage=1 
00:04:18.846 --rc genhtml_function_coverage=1 00:04:18.846 --rc genhtml_legend=1 00:04:18.846 --rc geninfo_all_blocks=1 00:04:18.846 --rc geninfo_unexecuted_blocks=1 00:04:18.846 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.846 ' 00:04:18.846 19:02:37 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:18.846 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:18.846 --rc genhtml_branch_coverage=1 00:04:18.846 --rc genhtml_function_coverage=1 00:04:18.846 --rc genhtml_legend=1 00:04:18.846 --rc geninfo_all_blocks=1 00:04:18.846 --rc geninfo_unexecuted_blocks=1 00:04:18.846 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:18.846 ' 00:04:18.846 19:02:37 -- setup/driver.sh@68 -- # setup reset 00:04:18.846 19:02:37 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:18.846 19:02:37 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:24.128 19:02:42 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:24.128 19:02:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:24.128 19:02:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:24.128 19:02:42 -- common/autotest_common.sh@10 -- # set +x 00:04:24.128 ************************************ 00:04:24.128 START TEST guess_driver 00:04:24.128 ************************************ 00:04:24.128 19:02:42 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:24.128 19:02:42 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:24.128 19:02:42 -- setup/driver.sh@47 -- # local fail=0 00:04:24.128 19:02:42 -- setup/driver.sh@49 -- # pick_driver 00:04:24.128 19:02:42 -- setup/driver.sh@36 -- # vfio 00:04:24.128 19:02:42 -- setup/driver.sh@21 -- # local iommu_grups 00:04:24.128 19:02:42 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:24.128 19:02:42 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:24.128 19:02:42 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:24.128 19:02:42 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:24.128 19:02:42 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:24.128 19:02:42 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:24.128 19:02:42 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:24.128 19:02:42 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:24.128 19:02:42 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:24.128 19:02:42 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:24.128 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:24.128 19:02:42 -- setup/driver.sh@30 -- # return 0 00:04:24.128 19:02:42 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:24.128 19:02:42 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:24.128 19:02:42 -- setup/driver.sh@51 
-- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:24.128 19:02:42 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:24.128 Looking for driver=vfio-pci
00:04:24.128 19:02:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:24.128 19:02:42 -- setup/driver.sh@45 -- # setup output config
00:04:24.128 19:02:42 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:24.128 19:02:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
[... setup/driver.sh@57-61 (00:04:27.516-00:04:28.908) reads back every "-> vfio-pci" marker printed by setup.sh config; each iteration passes the [[ -> == \-\> ]] and [[ vfio-pci == vfio-pci ]] checks ...]
00:04:28.908 19:02:47 -- setup/driver.sh@64 -- # (( fail == 0 ))
00:04:28.908 19:02:47 -- setup/driver.sh@65 -- # setup reset
00:04:28.908 19:02:47 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:28.908 19:02:47 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:34.185
00:04:34.185 real 0m9.846s
00:04:34.185 user 0m2.633s
00:04:34.185 sys 0m4.929s
00:04:34.185 19:02:51 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:34.185 19:02:51 -- common/autotest_common.sh@10 -- # set +x
00:04:34.185 ************************************
00:04:34.185 END TEST guess_driver
00:04:34.185 ************************************
00:04:34.185
00:04:34.185 real 0m14.941s
00:04:34.185 user 0m4.057s
00:04:34.185 sys 0m7.842s
00:04:34.185 19:02:52 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:34.185 19:02:52 -- common/autotest_common.sh@10 -- # set +x
00:04:34.185 ************************************
00:04:34.185 END TEST driver
00:04:34.185 ************************************
00:04:34.185 19:02:52 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:34.185 19:02:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:34.185 19:02:52 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:34.185 19:02:52 -- common/autotest_common.sh@10 -- # set +x
00:04:34.185 ************************************
00:04:34.185 START TEST devices
00:04:34.185 ************************************
00:04:34.185 19:02:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:34.185 * Looking for test storage...
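Stripped to its decision, the guess_driver test above checks three things: unsafe no-IOMMU mode, the IOMMU group count, and whether modprobe can resolve vfio_pci's dependency chain. A standalone sketch of that logic under those assumptions (the uio_pci_generic fallback is illustrative and not taken from this log; this host, with 176 IOMMU groups, took the vfio-pci branch):

#!/usr/bin/env bash
# Sketch of the driver pick traced above; not the exact setup/driver.sh.
pick_driver() {
    local unsafe_vfio=N n_groups
    n_groups=$(ls /sys/kernel/iommu_groups 2>/dev/null | wc -l)
    if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    fi
    if (( n_groups > 0 )) || [[ $unsafe_vfio == [Yy]* ]]; then
        # --show-depends prints "insmod /lib/modules/.../<mod>.ko..." for the
        # whole dependency chain without loading anything, as seen in the log.
        if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo uio_pci_generic    # assumed fallback, not exercised in this run
}
pick_driver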
00:04:34.185 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:34.185 19:02:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:34.185 19:02:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:34.185 19:02:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:34.185 19:02:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:34.185 19:02:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:34.185 19:02:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:34.185 19:02:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:34.185 19:02:52 -- scripts/common.sh@335 -- # IFS=.-: 00:04:34.185 19:02:52 -- scripts/common.sh@335 -- # read -ra ver1 00:04:34.185 19:02:52 -- scripts/common.sh@336 -- # IFS=.-: 00:04:34.185 19:02:52 -- scripts/common.sh@336 -- # read -ra ver2 00:04:34.185 19:02:52 -- scripts/common.sh@337 -- # local 'op=<' 00:04:34.185 19:02:52 -- scripts/common.sh@339 -- # ver1_l=2 00:04:34.185 19:02:52 -- scripts/common.sh@340 -- # ver2_l=1 00:04:34.185 19:02:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:34.185 19:02:52 -- scripts/common.sh@343 -- # case "$op" in 00:04:34.185 19:02:52 -- scripts/common.sh@344 -- # : 1 00:04:34.185 19:02:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:34.185 19:02:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:34.185 19:02:52 -- scripts/common.sh@364 -- # decimal 1 00:04:34.186 19:02:52 -- scripts/common.sh@352 -- # local d=1 00:04:34.186 19:02:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:34.186 19:02:52 -- scripts/common.sh@354 -- # echo 1 00:04:34.186 19:02:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:34.186 19:02:52 -- scripts/common.sh@365 -- # decimal 2 00:04:34.186 19:02:52 -- scripts/common.sh@352 -- # local d=2 00:04:34.186 19:02:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:34.186 19:02:52 -- scripts/common.sh@354 -- # echo 2 00:04:34.186 19:02:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:34.186 19:02:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:34.186 19:02:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:34.186 19:02:52 -- scripts/common.sh@367 -- # return 0 00:04:34.186 19:02:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:34.186 19:02:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:34.186 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.186 --rc genhtml_branch_coverage=1 00:04:34.186 --rc genhtml_function_coverage=1 00:04:34.186 --rc genhtml_legend=1 00:04:34.186 --rc geninfo_all_blocks=1 00:04:34.186 --rc geninfo_unexecuted_blocks=1 00:04:34.186 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.186 ' 00:04:34.186 19:02:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:34.186 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.186 --rc genhtml_branch_coverage=1 00:04:34.186 --rc genhtml_function_coverage=1 00:04:34.186 --rc genhtml_legend=1 00:04:34.186 --rc geninfo_all_blocks=1 00:04:34.186 --rc geninfo_unexecuted_blocks=1 00:04:34.186 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.186 ' 00:04:34.186 19:02:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:34.186 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.186 --rc genhtml_branch_coverage=1 
00:04:34.186 --rc genhtml_function_coverage=1 00:04:34.186 --rc genhtml_legend=1 00:04:34.186 --rc geninfo_all_blocks=1 00:04:34.186 --rc geninfo_unexecuted_blocks=1 00:04:34.186 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.186 ' 00:04:34.186 19:02:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:34.186 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:34.186 --rc genhtml_branch_coverage=1 00:04:34.186 --rc genhtml_function_coverage=1 00:04:34.186 --rc genhtml_legend=1 00:04:34.186 --rc geninfo_all_blocks=1 00:04:34.186 --rc geninfo_unexecuted_blocks=1 00:04:34.186 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:34.186 ' 00:04:34.186 19:02:52 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:34.186 19:02:52 -- setup/devices.sh@192 -- # setup reset 00:04:34.186 19:02:52 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:34.186 19:02:52 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:37.480 19:02:55 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:37.480 19:02:55 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:37.480 19:02:55 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:37.480 19:02:55 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:37.480 19:02:55 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:37.480 19:02:55 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:37.480 19:02:55 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:37.480 19:02:55 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:37.480 19:02:55 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:37.480 19:02:55 -- setup/devices.sh@196 -- # blocks=() 00:04:37.480 19:02:55 -- setup/devices.sh@196 -- # declare -a blocks 00:04:37.480 19:02:55 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:37.480 19:02:55 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:37.480 19:02:55 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:37.480 19:02:55 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:37.480 19:02:56 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:37.480 19:02:56 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:37.480 19:02:56 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:37.480 19:02:56 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:37.480 19:02:56 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:37.480 19:02:56 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:37.480 19:02:56 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:37.480 No valid GPT data, bailing 00:04:37.480 19:02:56 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:37.480 19:02:56 -- scripts/common.sh@393 -- # pt= 00:04:37.480 19:02:56 -- scripts/common.sh@394 -- # return 1 00:04:37.480 19:02:56 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:37.480 19:02:56 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:37.480 19:02:56 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:37.480 19:02:56 -- setup/common.sh@80 -- # echo 1600321314816 00:04:37.480 19:02:56 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:37.480 19:02:56 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:37.480 19:02:56 -- 
setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:37.480 19:02:56 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:37.480 19:02:56 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:37.480 19:02:56 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:37.480 19:02:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:37.480 19:02:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:37.480 19:02:56 -- common/autotest_common.sh@10 -- # set +x 00:04:37.480 ************************************ 00:04:37.480 START TEST nvme_mount 00:04:37.480 ************************************ 00:04:37.480 19:02:56 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:37.480 19:02:56 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:37.480 19:02:56 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:37.480 19:02:56 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:37.480 19:02:56 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:37.480 19:02:56 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:37.480 19:02:56 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:37.480 19:02:56 -- setup/common.sh@40 -- # local part_no=1 00:04:37.480 19:02:56 -- setup/common.sh@41 -- # local size=1073741824 00:04:37.480 19:02:56 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:37.480 19:02:56 -- setup/common.sh@44 -- # parts=() 00:04:37.480 19:02:56 -- setup/common.sh@44 -- # local parts 00:04:37.480 19:02:56 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:37.480 19:02:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:37.480 19:02:56 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:37.480 19:02:56 -- setup/common.sh@46 -- # (( part++ )) 00:04:37.480 19:02:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:37.738 19:02:56 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:37.738 19:02:56 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:37.738 19:02:56 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:38.676 Creating new GPT entries in memory. 00:04:38.676 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:38.676 other utilities. 00:04:38.676 19:02:57 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:38.676 19:02:57 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:38.676 19:02:57 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:38.676 19:02:57 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:38.676 19:02:57 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:39.615 Creating new GPT entries in memory. 00:04:39.615 The operation has completed successfully. 
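The nvme_mount setup traced here is a plain wipe/partition/format/mount cycle. A condensed equivalent, assuming a disposable scratch disk (the real script serializes udev events through scripts/sync_dev_uevents.sh under flock; partprobe stands in for that here, and the device and mount paths are placeholders):

#!/usr/bin/env bash
set -e
DISK=/dev/nvme0n1                       # DANGER: wiped below; use a scratch disk
MNT=/tmp/test_nvme_mount
sgdisk "$DISK" --zap-all                # destroy old GPT/MBR structures
sgdisk "$DISK" --new=1:2048:2099199     # sectors 2048..2099199 = 1 GiB
partprobe "$DISK"                       # let the kernel reread the table
mkfs.ext4 -qF "${DISK}p1"               # quiet + forced, as in the trace
mkdir -p "$MNT"
mount "${DISK}p1" "$MNT"

The sector math matches the trace: size=1073741824 bytes divided by 512 gives 2097152 sectors, so the first partition ends at 2048 + 2097152 - 1 = 2099199.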
00:04:39.615 19:02:58 -- setup/common.sh@57 -- # (( part++ ))
00:04:39.615 19:02:58 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:39.615 19:02:58 -- setup/common.sh@62 -- # wait 1257992
00:04:39.615 19:02:58 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.615 19:02:58 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=
00:04:39.615 19:02:58 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.615 19:02:58 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:04:39.615 19:02:58 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:04:39.874 19:02:58 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.874 19:02:58 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:39.874 19:02:58 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:39.874 19:02:58 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:04:39.874 19:02:58 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.874 19:02:58 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:39.874 19:02:58 -- setup/devices.sh@53 -- # local found=0
00:04:39.874 19:02:58 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:39.874 19:02:58 -- setup/devices.sh@56 -- # :
00:04:39.874 19:02:58 -- setup/devices.sh@59 -- # local pci status
00:04:39.874 19:02:58 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:39.874 19:02:58 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:39.874 19:02:58 -- setup/devices.sh@47 -- # setup output config
00:04:39.874 19:02:58 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:39.874 19:02:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:43.181 19:03:01 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:43.181 19:03:01 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:04:43.181 19:03:01 -- setup/devices.sh@63 -- # found=1
[... setup/devices.sh@60-62 walks the remaining PCI addresses (0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7); none match 0000:d8:00.0 ...]
00:04:43.181 19:03:01 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:43.181 19:03:01 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:43.181 19:03:01 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.181 19:03:01 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:43.181 19:03:01 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:43.181 19:03:01 -- setup/devices.sh@110 -- # cleanup_nvme
00:04:43.181 19:03:01 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.181 19:03:01 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.181 19:03:01 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:43.181 19:03:01 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:04:43.181 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:43.181 19:03:01 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:43.181 19:03:01 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:43.441 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:04:43.441 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:04:43.441 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:43.441 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:43.441 19:03:01 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:04:43.441 19:03:01 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:04:43.441 19:03:01 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.441 19:03:01 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:04:43.441 19:03:01 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:04:43.441 19:03:01 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.441 19:03:01 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
[... setup/devices.sh@48-60 sets up the same locals as before, then @47 re-runs `PCI_ALLOWED=0000:d8:00.0 setup output config` ...]
00:04:46.735 19:03:04 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:46.735 19:03:04 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:04:46.735 19:03:04 -- setup/devices.sh@63 -- # found=1
[... setup/devices.sh@60-62 again rejects 0000:00:04.0-0000:00:04.7 and 0000:80:04.0-0000:80:04.7 ...]
00:04:46.735 19:03:05 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:46.735 19:03:05 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:46.735 19:03:05 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:46.735 19:03:05 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:46.735 19:03:05 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:46.735 19:03:05 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:46.735 19:03:05 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' ''
[... setup/devices.sh@48-60: dev=0000:d8:00.0, mounts=data@nvme0n1, empty mount point and test file, then @47 `PCI_ALLOWED=0000:d8:00.0 setup output config` once more ...]
00:04:50.027 19:03:08 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:50.027 19:03:08 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]]
00:04:50.027 19:03:08 -- setup/devices.sh@63 -- # found=1
[... the PCI walk rejects the 0000:00:04.x and 0000:80:04.x addresses a third time ...]
00:04:50.027 19:03:08 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:50.027 19:03:08 -- setup/devices.sh@68 -- # [[ -n '' ]]
00:04:50.027 19:03:08 -- setup/devices.sh@68 -- # return 0
00:04:50.027 19:03:08 -- setup/devices.sh@128 -- # cleanup_nvme
00:04:50.027 19:03:08 -- setup/devices.sh@20 -- # mountpoint -q
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.027 19:03:08 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.027 19:03:08 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:50.027 19:03:08 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:50.027 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:50.027 00:04:50.027 real 0m12.428s 00:04:50.027 user 0m3.609s 00:04:50.027 sys 0m6.739s 00:04:50.027 19:03:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:50.027 19:03:08 -- common/autotest_common.sh@10 -- # set +x 00:04:50.027 ************************************ 00:04:50.027 END TEST nvme_mount 00:04:50.027 ************************************ 00:04:50.027 19:03:08 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:50.027 19:03:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:50.027 19:03:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:50.027 19:03:08 -- common/autotest_common.sh@10 -- # set +x 00:04:50.027 ************************************ 00:04:50.027 START TEST dm_mount 00:04:50.027 ************************************ 00:04:50.027 19:03:08 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:50.027 19:03:08 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:50.027 19:03:08 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:50.027 19:03:08 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:50.027 19:03:08 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:50.027 19:03:08 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:50.027 19:03:08 -- setup/common.sh@40 -- # local part_no=2 00:04:50.027 19:03:08 -- setup/common.sh@41 -- # local size=1073741824 00:04:50.027 19:03:08 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:50.027 19:03:08 -- setup/common.sh@44 -- # parts=() 00:04:50.027 19:03:08 -- setup/common.sh@44 -- # local parts 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.027 19:03:08 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part++ )) 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.027 19:03:08 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part++ )) 00:04:50.027 19:03:08 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.027 19:03:08 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:50.027 19:03:08 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:50.027 19:03:08 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:50.965 Creating new GPT entries in memory. 00:04:50.965 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:50.965 other utilities. 00:04:50.965 19:03:09 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:50.965 19:03:09 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:50.965 19:03:09 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:50.965 19:03:09 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:50.965 19:03:09 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:52.345 Creating new GPT entries in memory. 00:04:52.345 The operation has completed successfully. 
00:04:52.345 19:03:10 -- setup/common.sh@57 -- # (( part++ )) 00:04:52.345 19:03:10 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:52.345 19:03:10 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:52.345 19:03:10 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:52.345 19:03:10 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:53.283 The operation has completed successfully. 00:04:53.283 19:03:11 -- setup/common.sh@57 -- # (( part++ )) 00:04:53.283 19:03:11 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:53.283 19:03:11 -- setup/common.sh@62 -- # wait 1263052 00:04:53.283 19:03:11 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:53.283 19:03:11 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.283 19:03:11 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:53.283 19:03:11 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:53.283 19:03:11 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:53.283 19:03:11 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:53.283 19:03:11 -- setup/devices.sh@161 -- # break 00:04:53.283 19:03:11 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:53.283 19:03:11 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:53.283 19:03:11 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:53.283 19:03:11 -- setup/devices.sh@166 -- # dm=dm-0 00:04:53.283 19:03:11 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:53.283 19:03:11 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:53.283 19:03:11 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.283 19:03:11 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:53.283 19:03:11 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.283 19:03:11 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:53.283 19:03:11 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:53.283 19:03:11 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.283 19:03:11 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:53.283 19:03:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:53.283 19:03:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:53.283 19:03:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.283 19:03:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:53.283 19:03:11 -- setup/devices.sh@53 -- # local found=0 00:04:53.283 19:03:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:53.283 19:03:11 -- setup/devices.sh@56 -- # : 
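[editor's note] The dmsetup create / mkfs.ext4 -qF / mount sequence traced here never echoes the device-mapper table piped into dmsetup. A hedged reconstruction, assuming a linear concatenation of the two test partitions (the table itself is an assumption; the surrounding commands are taken from the trace):

    # Hedged reconstruction of the dm_mount setup; the linear table is assumed.
    mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
    s1=$(blockdev --getsz /dev/nvme0n1p1)   # 512 B sectors in partition 1
    s2=$(blockdev --getsz /dev/nvme0n1p2)   # 512 B sectors in partition 2
    printf '0 %s linear /dev/nvme0n1p1 0\n%s %s linear /dev/nvme0n1p2 0\n' \
        "$s1" "$s1" "$s2" | dmsetup create nvme_dm_test
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mkdir -p "$mount_point"
    mount /dev/mapper/nvme_dm_test "$mount_point"
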
00:04:53.283 19:03:11 -- setup/devices.sh@59 -- # local pci status 00:04:53.283 19:03:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.283 19:03:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:53.283 19:03:11 -- setup/devices.sh@47 -- # setup output config 00:04:53.283 19:03:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:53.283 19:03:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:56.573 19:03:14 -- setup/devices.sh@63 -- # found=1 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.573 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.573 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.574 19:03:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.574 19:03:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.574 19:03:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.574 19:03:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:15 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:56.574 19:03:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.574 19:03:15 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:56.574 19:03:15 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:56.574 19:03:15 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.574 19:03:15 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.574 19:03:15 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.574 19:03:15 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.832 19:03:15 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:56.832 19:03:15 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:56.832 19:03:15 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:56.832 19:03:15 -- setup/devices.sh@50 -- # local mount_point= 00:04:56.832 19:03:15 -- setup/devices.sh@51 -- # local test_file= 00:04:56.832 19:03:15 -- setup/devices.sh@53 -- # local found=0 00:04:56.832 19:03:15 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:56.832 19:03:15 -- setup/devices.sh@59 -- # local pci status 00:04:56.832 19:03:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.832 19:03:15 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:56.832 19:03:15 -- setup/devices.sh@47 -- # setup output config 00:04:56.832 19:03:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.832 19:03:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:00.125 19:03:18 -- setup/devices.sh@63 -- # found=1 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.125 19:03:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.125 19:03:18 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:00.125 19:03:18 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:00.125 19:03:18 -- setup/devices.sh@68 -- # return 0 00:05:00.125 19:03:18 -- setup/devices.sh@187 -- # cleanup_dm 00:05:00.125 19:03:18 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.125 19:03:18 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:00.126 19:03:18 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:00.126 19:03:18 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:00.126 19:03:18 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:00.126 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:00.126 19:03:18 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:00.126 19:03:18 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:00.126 00:05:00.126 real 0m10.046s 00:05:00.126 user 0m2.461s 00:05:00.126 sys 0m4.691s 00:05:00.126 19:03:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.126 19:03:18 -- common/autotest_common.sh@10 -- # set +x 00:05:00.126 ************************************ 00:05:00.126 END TEST dm_mount 00:05:00.126 ************************************ 00:05:00.126 19:03:18 -- setup/devices.sh@1 -- # cleanup 00:05:00.126 19:03:18 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:00.126 19:03:18 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:00.126 19:03:18 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:00.126 19:03:18 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:00.126 19:03:18 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:00.126 19:03:18 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:00.385 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:00.385 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:00.385 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:00.385 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:00.385 19:03:18 -- setup/devices.sh@12 -- # cleanup_dm 00:05:00.385 19:03:18 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.385 19:03:18 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:00.385 19:03:18 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:00.385 19:03:18 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:00.385 19:03:18 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:00.385 19:03:18 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:00.385 00:05:00.385 real 0m26.849s 00:05:00.385 user 0m7.597s 00:05:00.385 sys 0m14.209s 00:05:00.385 19:03:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.385 19:03:18 -- common/autotest_common.sh@10 -- # set +x 00:05:00.385 ************************************ 00:05:00.385 END TEST devices 00:05:00.385 ************************************ 00:05:00.385 00:05:00.385 real 1m33.647s 00:05:00.385 user 0m29.262s 00:05:00.385 sys 0m53.315s 00:05:00.385 19:03:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:00.385 19:03:18 -- common/autotest_common.sh@10 -- # set +x 00:05:00.385 ************************************ 00:05:00.385 END TEST setup.sh 00:05:00.385 ************************************ 00:05:00.644 19:03:19 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:03.936 Hugepages 00:05:03.936 node hugesize free / total 00:05:03.936 node0 1048576kB 0 / 0 00:05:03.936 node0 2048kB 2048 / 2048 00:05:03.936 node1 1048576kB 0 / 0 00:05:03.936 node1 2048kB 0 / 0 00:05:03.936 00:05:03.936 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:03.936 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:03.936 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:03.936 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:03.936 19:03:22 -- spdk/autotest.sh@128 -- # uname -s 00:05:03.936 19:03:22 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:03.936 19:03:22 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:03.936 19:03:22 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:07.229 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
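[editor's note] The "ioatdma -> vfio-pci" lines above are setup.sh rebinding each I/OAT channel to vfio-pci for userspace access. The script's internals are not shown in this log, so treat the following as an approximation of the standard sysfs driver_override mechanism it relies on:

    # Approximate rebind of one device from its kernel driver to vfio-pci.
    bdf=0000:00:04.7
    echo "$bdf"   > "/sys/bus/pci/devices/$bdf/driver/unbind"
    echo vfio-pci > "/sys/bus/pci/devices/$bdf/driver_override"
    echo "$bdf"   > /sys/bus/pci/drivers_probe
    # The later "vfio-pci -> ioatdma" reset clears driver_override and re-probes.
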
00:05:07.229 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:07.229 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:07.489 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:09.396 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:09.396 19:03:27 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:10.334 19:03:28 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:10.334 19:03:28 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:10.334 19:03:28 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:10.334 19:03:28 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:10.334 19:03:28 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:10.334 19:03:28 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:10.334 19:03:28 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:10.334 19:03:28 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:10.334 19:03:28 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:10.334 19:03:28 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:10.334 19:03:28 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:10.334 19:03:28 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:13.626 Waiting for block devices as requested 00:05:13.626 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:13.626 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:13.626 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:13.886 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:13.886 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:13.886 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:13.886 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:14.146 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:14.146 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:14.146 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:14.406 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:14.406 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:14.406 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:14.666 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:14.666 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:14.666 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:14.926 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:14.926 19:03:33 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:14.926 19:03:33 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:14.926 19:03:33 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1498 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:14.926 19:03:33 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:14.926 19:03:33 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:14.926 19:03:33 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:14.926 19:03:33 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:14.926 19:03:33 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:14.926 19:03:33 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:14.926 19:03:33 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:14.926 19:03:33 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:14.926 19:03:33 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:14.926 19:03:33 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:14.926 19:03:33 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:14.926 19:03:33 -- common/autotest_common.sh@1552 -- # continue 00:05:14.926 19:03:33 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:14.926 19:03:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:14.926 19:03:33 -- common/autotest_common.sh@10 -- # set +x 00:05:15.186 19:03:33 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:15.186 19:03:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:15.186 19:03:33 -- common/autotest_common.sh@10 -- # set +x 00:05:15.186 19:03:33 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:18.481 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.481 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:19.862 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:19.862 19:03:38 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:19.862 19:03:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:19.862 19:03:38 -- common/autotest_common.sh@10 -- # set +x 00:05:19.862 19:03:38 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:19.862 19:03:38 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:19.862 19:03:38 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:19.862 19:03:38 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:19.862 19:03:38 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:19.862 19:03:38 -- common/autotest_common.sh@1574 -- # 
get_nvme_bdfs 00:05:19.862 19:03:38 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:19.862 19:03:38 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:19.862 19:03:38 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:19.862 19:03:38 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:19.862 19:03:38 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:20.122 19:03:38 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:20.122 19:03:38 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:20.122 19:03:38 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:20.122 19:03:38 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:20.122 19:03:38 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:20.122 19:03:38 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:20.122 19:03:38 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:20.122 19:03:38 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:20.122 19:03:38 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:20.122 19:03:38 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=1272972 00:05:20.122 19:03:38 -- common/autotest_common.sh@1593 -- # waitforlisten 1272972 00:05:20.122 19:03:38 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:20.122 19:03:38 -- common/autotest_common.sh@829 -- # '[' -z 1272972 ']' 00:05:20.122 19:03:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:20.122 19:03:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:20.122 19:03:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:20.122 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:20.122 19:03:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:20.122 19:03:38 -- common/autotest_common.sh@10 -- # set +x 00:05:20.122 [2024-11-18 19:03:38.588797] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
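[editor's note] The get_nvme_bdfs / get_nvme_bdfs_by_id helpers traced above enumerate controllers via gen_nvme.sh and then filter on PCI device ID 0x0a54 before the OPAL revert. Condensed directly from the traced commands:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
    opal_bdfs=()
    for bdf in "${bdfs[@]}"; do
        device=$(cat "/sys/bus/pci/devices/$bdf/device")   # e.g. 0x0a54
        [[ $device == 0x0a54 ]] && opal_bdfs+=("$bdf")
    done
    printf '%s\n' "${opal_bdfs[@]}"   # -> 0000:d8:00.0 on this node
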
00:05:20.122 [2024-11-18 19:03:38.588866] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1272972 ] 00:05:20.122 EAL: No free 2048 kB hugepages reported on node 1 00:05:20.122 [2024-11-18 19:03:38.657266] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:20.382 [2024-11-18 19:03:38.728394] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:20.382 [2024-11-18 19:03:38.728500] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.950 19:03:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:20.950 19:03:39 -- common/autotest_common.sh@862 -- # return 0 00:05:20.950 19:03:39 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:20.950 19:03:39 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:20.950 19:03:39 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:24.241 nvme0n1 00:05:24.241 19:03:42 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:24.241 [2024-11-18 19:03:42.572782] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:24.241 request: 00:05:24.241 { 00:05:24.241 "nvme_ctrlr_name": "nvme0", 00:05:24.241 "password": "test", 00:05:24.241 "method": "bdev_nvme_opal_revert", 00:05:24.241 "req_id": 1 00:05:24.241 } 00:05:24.241 Got JSON-RPC error response 00:05:24.241 response: 00:05:24.241 { 00:05:24.241 "code": -32602, 00:05:24.241 "message": "Invalid parameters" 00:05:24.241 } 00:05:24.241 19:03:42 -- common/autotest_common.sh@1599 -- # true 00:05:24.241 19:03:42 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:24.241 19:03:42 -- common/autotest_common.sh@1603 -- # killprocess 1272972 00:05:24.241 19:03:42 -- common/autotest_common.sh@936 -- # '[' -z 1272972 ']' 00:05:24.241 19:03:42 -- common/autotest_common.sh@940 -- # kill -0 1272972 00:05:24.241 19:03:42 -- common/autotest_common.sh@941 -- # uname 00:05:24.241 19:03:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:24.241 19:03:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1272972 00:05:24.241 19:03:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:24.241 19:03:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:24.241 19:03:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1272972' 00:05:24.241 killing process with pid 1272972 00:05:24.241 19:03:42 -- common/autotest_common.sh@955 -- # kill 1272972 00:05:24.241 19:03:42 -- common/autotest_common.sh@960 -- # wait 1272972 00:05:26.780 19:03:44 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:26.780 19:03:44 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:26.780 19:03:44 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:26.780 19:03:44 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:26.780 19:03:44 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:26.780 19:03:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:26.780 19:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:26.780 19:03:44 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:26.780 
19:03:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.780 19:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:26.780 ************************************ 00:05:26.780 START TEST env 00:05:26.780 ************************************ 00:05:26.780 19:03:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:26.780 * Looking for test storage... 00:05:26.780 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:26.780 19:03:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:26.780 19:03:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:26.780 19:03:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:26.780 19:03:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:26.780 19:03:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:26.780 19:03:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:26.780 19:03:44 -- scripts/common.sh@335 -- # IFS=.-: 00:05:26.780 19:03:44 -- scripts/common.sh@335 -- # read -ra ver1 00:05:26.780 19:03:44 -- scripts/common.sh@336 -- # IFS=.-: 00:05:26.780 19:03:44 -- scripts/common.sh@336 -- # read -ra ver2 00:05:26.780 19:03:44 -- scripts/common.sh@337 -- # local 'op=<' 00:05:26.780 19:03:44 -- scripts/common.sh@339 -- # ver1_l=2 00:05:26.780 19:03:44 -- scripts/common.sh@340 -- # ver2_l=1 00:05:26.780 19:03:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:26.780 19:03:44 -- scripts/common.sh@343 -- # case "$op" in 00:05:26.780 19:03:44 -- scripts/common.sh@344 -- # : 1 00:05:26.780 19:03:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:26.780 19:03:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:26.780 19:03:44 -- scripts/common.sh@364 -- # decimal 1 00:05:26.780 19:03:44 -- scripts/common.sh@352 -- # local d=1 00:05:26.780 19:03:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:26.780 19:03:44 -- scripts/common.sh@354 -- # echo 1 00:05:26.780 19:03:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:26.780 19:03:44 -- scripts/common.sh@365 -- # decimal 2 00:05:26.780 19:03:44 -- scripts/common.sh@352 -- # local d=2 00:05:26.780 19:03:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:26.780 19:03:44 -- scripts/common.sh@354 -- # echo 2 00:05:26.780 19:03:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:26.780 19:03:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:26.780 19:03:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:26.780 19:03:44 -- scripts/common.sh@367 -- # return 0 00:05:26.780 19:03:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:26.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:26.780 --rc genhtml_branch_coverage=1 00:05:26.780 --rc genhtml_function_coverage=1 00:05:26.780 --rc genhtml_legend=1 00:05:26.780 --rc geninfo_all_blocks=1 00:05:26.780 --rc geninfo_unexecuted_blocks=1 00:05:26.780 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:26.780 ' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:26.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:26.780 --rc genhtml_branch_coverage=1 00:05:26.780 --rc genhtml_function_coverage=1 00:05:26.780 --rc genhtml_legend=1 00:05:26.780 --rc geninfo_all_blocks=1 00:05:26.780 --rc geninfo_unexecuted_blocks=1 00:05:26.780 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:26.780 ' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:26.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:26.780 --rc genhtml_branch_coverage=1 00:05:26.780 --rc genhtml_function_coverage=1 00:05:26.780 --rc genhtml_legend=1 00:05:26.780 --rc geninfo_all_blocks=1 00:05:26.780 --rc geninfo_unexecuted_blocks=1 00:05:26.780 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:26.780 ' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:26.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:26.780 --rc genhtml_branch_coverage=1 00:05:26.780 --rc genhtml_function_coverage=1 00:05:26.780 --rc genhtml_legend=1 00:05:26.780 --rc geninfo_all_blocks=1 00:05:26.780 --rc geninfo_unexecuted_blocks=1 00:05:26.780 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:26.780 ' 00:05:26.780 19:03:44 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:26.780 19:03:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.780 19:03:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.780 19:03:44 -- common/autotest_common.sh@10 -- # set +x 00:05:26.780 ************************************ 00:05:26.780 START TEST env_memory 00:05:26.780 ************************************ 00:05:26.780 19:03:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
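[editor's note] The scripts/common.sh trace above is lcov version detection: split both version strings on '.', '-', or ':' and compare field by field ('lt 1.15 2' decides which LCOV_OPTS flavor to export). A condensed reconstruction of that comparison:

    # Condensed from the cmp_versions trace: field-wise numeric comparison.
    lt() {
        local IFS=.-:
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v n=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < n; v++ )); do
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        done
        return 1   # equal is not less-than
    }
    lt "$(lcov --version | awk '{print $NF}')" 2 && echo "lcov 1.x detected"
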
00:05:26.780 00:05:26.780 00:05:26.780 CUnit - A unit testing framework for C - Version 2.1-3 00:05:26.780 http://cunit.sourceforge.net/ 00:05:26.780 00:05:26.780 00:05:26.780 Suite: memory 00:05:26.780 Test: alloc and free memory map ...[2024-11-18 19:03:45.007215] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:26.780 passed 00:05:26.780 Test: mem map translation ...[2024-11-18 19:03:45.020068] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:26.780 [2024-11-18 19:03:45.020087] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:26.780 [2024-11-18 19:03:45.020119] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:26.780 [2024-11-18 19:03:45.020127] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:26.780 passed 00:05:26.780 Test: mem map registration ...[2024-11-18 19:03:45.041294] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:26.780 [2024-11-18 19:03:45.041310] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:26.780 passed 00:05:26.780 Test: mem map adjacent registrations ...passed 00:05:26.780 00:05:26.780 Run Summary: Type Total Ran Passed Failed Inactive 00:05:26.780 suites 1 1 n/a 0 0 00:05:26.780 tests 4 4 4 0 0 00:05:26.781 asserts 152 152 152 0 n/a 00:05:26.781 00:05:26.781 Elapsed time = 0.084 seconds 00:05:26.781 00:05:26.781 real 0m0.097s 00:05:26.781 user 0m0.083s 00:05:26.781 sys 0m0.013s 00:05:26.781 19:03:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:26.781 19:03:45 -- common/autotest_common.sh@10 -- # set +x 00:05:26.781 ************************************ 00:05:26.781 END TEST env_memory 00:05:26.781 ************************************ 00:05:26.781 19:03:45 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:26.781 19:03:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.781 19:03:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.781 19:03:45 -- common/autotest_common.sh@10 -- # set +x 00:05:26.781 ************************************ 00:05:26.781 START TEST env_vtophys 00:05:26.781 ************************************ 00:05:26.781 19:03:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:26.781 EAL: lib.eal log level changed from notice to debug 00:05:26.781 EAL: Detected lcore 0 as core 0 on socket 0 00:05:26.781 EAL: Detected lcore 1 as core 1 on socket 0 00:05:26.781 EAL: Detected lcore 2 as core 2 on socket 0 00:05:26.781 EAL: Detected lcore 3 as core 3 on socket 0 00:05:26.781 EAL: Detected lcore 4 as core 4 on socket 0 00:05:26.781 EAL: Detected lcore 5 as core 5 on socket 0 00:05:26.781 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:26.781 EAL: Detected lcore 7 as core 8 on socket 0 00:05:26.781 EAL: Detected lcore 8 as core 9 on socket 0 00:05:26.781 EAL: Detected lcore 9 as core 10 on socket 0 00:05:26.781 EAL: Detected lcore 10 as core 11 on socket 0 00:05:26.781 EAL: Detected lcore 11 as core 12 on socket 0 00:05:26.781 EAL: Detected lcore 12 as core 13 on socket 0 00:05:26.781 EAL: Detected lcore 13 as core 14 on socket 0 00:05:26.781 EAL: Detected lcore 14 as core 16 on socket 0 00:05:26.781 EAL: Detected lcore 15 as core 17 on socket 0 00:05:26.781 EAL: Detected lcore 16 as core 18 on socket 0 00:05:26.781 EAL: Detected lcore 17 as core 19 on socket 0 00:05:26.781 EAL: Detected lcore 18 as core 20 on socket 0 00:05:26.781 EAL: Detected lcore 19 as core 21 on socket 0 00:05:26.781 EAL: Detected lcore 20 as core 22 on socket 0 00:05:26.781 EAL: Detected lcore 21 as core 24 on socket 0 00:05:26.781 EAL: Detected lcore 22 as core 25 on socket 0 00:05:26.781 EAL: Detected lcore 23 as core 26 on socket 0 00:05:26.781 EAL: Detected lcore 24 as core 27 on socket 0 00:05:26.781 EAL: Detected lcore 25 as core 28 on socket 0 00:05:26.781 EAL: Detected lcore 26 as core 29 on socket 0 00:05:26.781 EAL: Detected lcore 27 as core 30 on socket 0 00:05:26.781 EAL: Detected lcore 28 as core 0 on socket 1 00:05:26.781 EAL: Detected lcore 29 as core 1 on socket 1 00:05:26.781 EAL: Detected lcore 30 as core 2 on socket 1 00:05:26.781 EAL: Detected lcore 31 as core 3 on socket 1 00:05:26.781 EAL: Detected lcore 32 as core 4 on socket 1 00:05:26.781 EAL: Detected lcore 33 as core 5 on socket 1 00:05:26.781 EAL: Detected lcore 34 as core 6 on socket 1 00:05:26.781 EAL: Detected lcore 35 as core 8 on socket 1 00:05:26.781 EAL: Detected lcore 36 as core 9 on socket 1 00:05:26.781 EAL: Detected lcore 37 as core 10 on socket 1 00:05:26.781 EAL: Detected lcore 38 as core 11 on socket 1 00:05:26.781 EAL: Detected lcore 39 as core 12 on socket 1 00:05:26.781 EAL: Detected lcore 40 as core 13 on socket 1 00:05:26.781 EAL: Detected lcore 41 as core 14 on socket 1 00:05:26.781 EAL: Detected lcore 42 as core 16 on socket 1 00:05:26.781 EAL: Detected lcore 43 as core 17 on socket 1 00:05:26.781 EAL: Detected lcore 44 as core 18 on socket 1 00:05:26.781 EAL: Detected lcore 45 as core 19 on socket 1 00:05:26.781 EAL: Detected lcore 46 as core 20 on socket 1 00:05:26.781 EAL: Detected lcore 47 as core 21 on socket 1 00:05:26.781 EAL: Detected lcore 48 as core 22 on socket 1 00:05:26.781 EAL: Detected lcore 49 as core 24 on socket 1 00:05:26.781 EAL: Detected lcore 50 as core 25 on socket 1 00:05:26.781 EAL: Detected lcore 51 as core 26 on socket 1 00:05:26.781 EAL: Detected lcore 52 as core 27 on socket 1 00:05:26.781 EAL: Detected lcore 53 as core 28 on socket 1 00:05:26.781 EAL: Detected lcore 54 as core 29 on socket 1 00:05:26.781 EAL: Detected lcore 55 as core 30 on socket 1 00:05:26.781 EAL: Detected lcore 56 as core 0 on socket 0 00:05:26.781 EAL: Detected lcore 57 as core 1 on socket 0 00:05:26.781 EAL: Detected lcore 58 as core 2 on socket 0 00:05:26.781 EAL: Detected lcore 59 as core 3 on socket 0 00:05:26.781 EAL: Detected lcore 60 as core 4 on socket 0 00:05:26.781 EAL: Detected lcore 61 as core 5 on socket 0 00:05:26.781 EAL: Detected lcore 62 as core 6 on socket 0 00:05:26.781 EAL: Detected lcore 63 as core 8 on socket 0 00:05:26.781 EAL: Detected lcore 64 as core 9 on socket 0 00:05:26.781 EAL: Detected lcore 65 as core 10 on socket 0 00:05:26.781 EAL: Detected lcore 66 as core 11 on socket 0 00:05:26.781 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:26.781 EAL: Detected lcore 68 as core 13 on socket 0 00:05:26.781 EAL: Detected lcore 69 as core 14 on socket 0 00:05:26.781 EAL: Detected lcore 70 as core 16 on socket 0 00:05:26.781 EAL: Detected lcore 71 as core 17 on socket 0 00:05:26.781 EAL: Detected lcore 72 as core 18 on socket 0 00:05:26.781 EAL: Detected lcore 73 as core 19 on socket 0 00:05:26.781 EAL: Detected lcore 74 as core 20 on socket 0 00:05:26.781 EAL: Detected lcore 75 as core 21 on socket 0 00:05:26.781 EAL: Detected lcore 76 as core 22 on socket 0 00:05:26.781 EAL: Detected lcore 77 as core 24 on socket 0 00:05:26.781 EAL: Detected lcore 78 as core 25 on socket 0 00:05:26.781 EAL: Detected lcore 79 as core 26 on socket 0 00:05:26.781 EAL: Detected lcore 80 as core 27 on socket 0 00:05:26.781 EAL: Detected lcore 81 as core 28 on socket 0 00:05:26.781 EAL: Detected lcore 82 as core 29 on socket 0 00:05:26.781 EAL: Detected lcore 83 as core 30 on socket 0 00:05:26.781 EAL: Detected lcore 84 as core 0 on socket 1 00:05:26.781 EAL: Detected lcore 85 as core 1 on socket 1 00:05:26.781 EAL: Detected lcore 86 as core 2 on socket 1 00:05:26.781 EAL: Detected lcore 87 as core 3 on socket 1 00:05:26.781 EAL: Detected lcore 88 as core 4 on socket 1 00:05:26.781 EAL: Detected lcore 89 as core 5 on socket 1 00:05:26.781 EAL: Detected lcore 90 as core 6 on socket 1 00:05:26.781 EAL: Detected lcore 91 as core 8 on socket 1 00:05:26.781 EAL: Detected lcore 92 as core 9 on socket 1 00:05:26.781 EAL: Detected lcore 93 as core 10 on socket 1 00:05:26.781 EAL: Detected lcore 94 as core 11 on socket 1 00:05:26.781 EAL: Detected lcore 95 as core 12 on socket 1 00:05:26.781 EAL: Detected lcore 96 as core 13 on socket 1 00:05:26.781 EAL: Detected lcore 97 as core 14 on socket 1 00:05:26.781 EAL: Detected lcore 98 as core 16 on socket 1 00:05:26.781 EAL: Detected lcore 99 as core 17 on socket 1 00:05:26.781 EAL: Detected lcore 100 as core 18 on socket 1 00:05:26.781 EAL: Detected lcore 101 as core 19 on socket 1 00:05:26.781 EAL: Detected lcore 102 as core 20 on socket 1 00:05:26.781 EAL: Detected lcore 103 as core 21 on socket 1 00:05:26.781 EAL: Detected lcore 104 as core 22 on socket 1 00:05:26.781 EAL: Detected lcore 105 as core 24 on socket 1 00:05:26.781 EAL: Detected lcore 106 as core 25 on socket 1 00:05:26.781 EAL: Detected lcore 107 as core 26 on socket 1 00:05:26.781 EAL: Detected lcore 108 as core 27 on socket 1 00:05:26.781 EAL: Detected lcore 109 as core 28 on socket 1 00:05:26.781 EAL: Detected lcore 110 as core 29 on socket 1 00:05:26.781 EAL: Detected lcore 111 as core 30 on socket 1 00:05:26.781 EAL: Maximum logical cores by configuration: 128 00:05:26.781 EAL: Detected CPU lcores: 112 00:05:26.781 EAL: Detected NUMA nodes: 2 00:05:26.781 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:26.781 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:26.781 EAL: Checking presence of .so 'librte_eal.so' 00:05:26.781 EAL: Detected static linkage of DPDK 00:05:26.781 EAL: No shared files mode enabled, IPC will be disabled 00:05:26.781 EAL: Bus pci wants IOVA as 'DC' 00:05:26.781 EAL: Buses did not request a specific IOVA mode. 00:05:26.781 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:26.781 EAL: Selected IOVA mode 'VA' 00:05:26.781 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.781 EAL: Probing VFIO support... 
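[editor's note] The lcore listing above is EAL reading Linux CPU topology (112 lcores across 2 NUMA nodes on this rig). The same mapping is visible from sysfs; an illustrative one-liner, not part of the harness:

    # Print "lcore N as core C on socket S" the way EAL derives it from sysfs.
    for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
        n=${cpu##*cpu}
        c=$(cat "$cpu/topology/core_id")
        s=$(cat "$cpu/topology/physical_package_id")
        echo "lcore $n as core $c on socket $s"
    done
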
00:05:26.781 EAL: IOMMU type 1 (Type 1) is supported 00:05:26.781 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:26.781 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:26.781 EAL: VFIO support initialized 00:05:26.781 EAL: Ask a virtual area of 0x2e000 bytes 00:05:26.781 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:26.781 EAL: Setting up physically contiguous memory... 00:05:26.781 EAL: Setting maximum number of open files to 524288 00:05:26.781 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:26.781 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:26.781 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:26.781 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.781 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:26.781 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:26.781 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.781 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:26.781 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:26.781 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.781 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:26.781 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:26.781 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.781 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:26.781 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:26.781 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.781 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:26.781 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:26.781 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.781 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:26.781 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:26.781 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.781 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:26.781 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:26.781 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.781 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:26.781 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:26.781 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:26.781 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.781 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:26.782 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:26.782 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.782 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:26.782 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:26.782 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.782 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:26.782 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:26.782 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.782 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:26.782 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:26.782 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.782 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:26.782 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:26.782 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.782 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:26.782 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:26.782 EAL: Ask a virtual area of 0x61000 bytes 00:05:26.782 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:26.782 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:26.782 EAL: Ask a virtual area of 0x400000000 bytes 00:05:26.782 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:26.782 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:26.782 EAL: Hugepages will be freed exactly as allocated. 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: TSC frequency is ~2500000 KHz 00:05:26.782 EAL: Main lcore 0 is ready (tid=7f350a589a00;cpuset=[0]) 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 0 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 2MB 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Mem event callback 'spdk:(nil)' registered 00:05:26.782 00:05:26.782 00:05:26.782 CUnit - A unit testing framework for C - Version 2.1-3 00:05:26.782 http://cunit.sourceforge.net/ 00:05:26.782 00:05:26.782 00:05:26.782 Suite: components_suite 00:05:26.782 Test: vtophys_malloc_test ...passed 00:05:26.782 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 4MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 4MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 6MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 6MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 10MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 10MB 00:05:26.782 EAL: Trying to obtain current memory policy. 
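[editor's note] Each "Setting policy MPOL_PREFERRED for socket 0 ... Restoring previous memory policy" pair below is EAL steering hugepage allocations to one socket via set_mempolicy(2). From a shell, a comparable preference can be approximated with numactl; this is an illustration, not something the test itself does:

    # Run the vtophys binary with socket-0 pages preferred, then show the policy.
    numactl --preferred=0 \
        /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys
    numactl --show   # prints the active NUMA policy for the current shell
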
00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 18MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 18MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 34MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 34MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 66MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 66MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 130MB 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was shrunk by 130MB 00:05:26.782 EAL: Trying to obtain current memory policy. 00:05:26.782 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:26.782 EAL: Restoring previous memory policy: 4 00:05:26.782 EAL: Calling mem event callback 'spdk:(nil)' 00:05:26.782 EAL: request: mp_malloc_sync 00:05:26.782 EAL: No shared files mode enabled, IPC is disabled 00:05:26.782 EAL: Heap on socket 0 was expanded by 258MB 00:05:27.042 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.042 EAL: request: mp_malloc_sync 00:05:27.042 EAL: No shared files mode enabled, IPC is disabled 00:05:27.042 EAL: Heap on socket 0 was shrunk by 258MB 00:05:27.042 EAL: Trying to obtain current memory policy. 
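[editor's note] The expand/shrink pairs in this suite walk the allocation size upward (4, 6, 10, 18, 34, 66, 130, 258 MB, ...), and each expansion is backed by the node's 2048 kB hugepages, so the kernel counters move in step. They can be inspected alongside the test (standard kernel interfaces, assumed available here):

    # Per-node counter matches the "Heap on socket 0" messages above.
    grep -E 'HugePages_(Total|Free)' /proc/meminfo
    cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/free_hugepages
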
00:05:27.042 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:27.042 EAL: Restoring previous memory policy: 4 00:05:27.042 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.042 EAL: request: mp_malloc_sync 00:05:27.042 EAL: No shared files mode enabled, IPC is disabled 00:05:27.042 EAL: Heap on socket 0 was expanded by 514MB 00:05:27.042 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.301 EAL: request: mp_malloc_sync 00:05:27.301 EAL: No shared files mode enabled, IPC is disabled 00:05:27.301 EAL: Heap on socket 0 was shrunk by 514MB 00:05:27.301 EAL: Trying to obtain current memory policy. 00:05:27.301 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:27.301 EAL: Restoring previous memory policy: 4 00:05:27.301 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.301 EAL: request: mp_malloc_sync 00:05:27.301 EAL: No shared files mode enabled, IPC is disabled 00:05:27.301 EAL: Heap on socket 0 was expanded by 1026MB 00:05:27.561 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.821 EAL: request: mp_malloc_sync 00:05:27.821 EAL: No shared files mode enabled, IPC is disabled 00:05:27.821 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:27.821 passed 00:05:27.821 00:05:27.821 Run Summary: Type Total Ran Passed Failed Inactive 00:05:27.821 suites 1 1 n/a 0 0 00:05:27.821 tests 2 2 2 0 0 00:05:27.821 asserts 497 497 497 0 n/a 00:05:27.821 00:05:27.821 Elapsed time = 0.964 seconds 00:05:27.821 EAL: Calling mem event callback 'spdk:(nil)' 00:05:27.821 EAL: request: mp_malloc_sync 00:05:27.821 EAL: No shared files mode enabled, IPC is disabled 00:05:27.821 EAL: Heap on socket 0 was shrunk by 2MB 00:05:27.821 EAL: No shared files mode enabled, IPC is disabled 00:05:27.821 EAL: No shared files mode enabled, IPC is disabled 00:05:27.821 EAL: No shared files mode enabled, IPC is disabled 00:05:27.821 00:05:27.821 real 0m1.095s 00:05:27.821 user 0m0.633s 00:05:27.821 sys 0m0.423s 00:05:27.821 19:03:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.821 19:03:46 -- common/autotest_common.sh@10 -- # set +x 00:05:27.821 ************************************ 00:05:27.821 END TEST env_vtophys 00:05:27.821 ************************************ 00:05:27.821 19:03:46 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:27.821 19:03:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.821 19:03:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.821 19:03:46 -- common/autotest_common.sh@10 -- # set +x 00:05:27.821 ************************************ 00:05:27.821 START TEST env_pci 00:05:27.821 ************************************ 00:05:27.821 19:03:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:27.821 00:05:27.821 00:05:27.821 CUnit - A unit testing framework for C - Version 2.1-3 00:05:27.821 http://cunit.sourceforge.net/ 00:05:27.821 00:05:27.821 00:05:27.821 Suite: pci 00:05:27.821 Test: pci_hook ...[2024-11-18 19:03:46.275507] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1274441 has claimed it 00:05:27.821 EAL: Cannot find device (10000:00:01.0) 00:05:27.821 EAL: Failed to attach device on primary process 00:05:27.821 passed 00:05:27.822 00:05:27.822 Run Summary: Type Total Ran Passed Failed Inactive 00:05:27.822 suites 1 1 n/a 0 0 00:05:27.822 tests 1 1 1 0 0 
00:05:27.822 asserts 25 25 25 0 n/a 00:05:27.822 00:05:27.822 Elapsed time = 0.037 seconds 00:05:27.822 00:05:27.822 real 0m0.057s 00:05:27.822 user 0m0.012s 00:05:27.822 sys 0m0.044s 00:05:27.822 19:03:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.822 19:03:46 -- common/autotest_common.sh@10 -- # set +x 00:05:27.822 ************************************ 00:05:27.822 END TEST env_pci 00:05:27.822 ************************************ 00:05:27.822 19:03:46 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:27.822 19:03:46 -- env/env.sh@15 -- # uname 00:05:27.822 19:03:46 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:27.822 19:03:46 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:27.822 19:03:46 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:27.822 19:03:46 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:27.822 19:03:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.822 19:03:46 -- common/autotest_common.sh@10 -- # set +x 00:05:27.822 ************************************ 00:05:27.822 START TEST env_dpdk_post_init 00:05:27.822 ************************************ 00:05:27.822 19:03:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:27.822 EAL: Detected CPU lcores: 112 00:05:27.822 EAL: Detected NUMA nodes: 2 00:05:27.822 EAL: Detected static linkage of DPDK 00:05:27.822 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:28.081 EAL: Selected IOVA mode 'VA' 00:05:28.082 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.082 EAL: VFIO support initialized 00:05:28.082 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:28.082 EAL: Using IOMMU type 1 (Type 1) 00:05:28.651 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:32.848 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:32.848 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:32.848 Starting DPDK initialization... 00:05:32.848 Starting SPDK post initialization... 00:05:32.848 SPDK NVMe probe 00:05:32.848 Attaching to 0000:d8:00.0 00:05:32.848 Attached to 0000:d8:00.0 00:05:32.848 Cleaning up... 
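[editor's note] The env_dpdk_post_init output above (EAL bring-up, then "SPDK NVMe probe ... Attaching ... Attached") corresponds to the standard SPDK enumeration flow. A hedged sketch of that flow using the public API; the app name is illustrative, and the real test additionally passes the -c 0x1 and --base-virtaddr arguments seen in its command line:

    #include <stdbool.h>
    #include <stdio.h>
    #include "spdk/env.h"
    #include "spdk/nvme.h"

    /* Return true to attach to every NVMe controller found. */
    static bool
    probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
             struct spdk_nvme_ctrlr_opts *opts)
    {
        printf("Attaching to %s\n", trid->traddr);
        return true;
    }

    static void
    attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
              struct spdk_nvme_ctrlr *ctrlr,
              const struct spdk_nvme_ctrlr_opts *opts)
    {
        printf("Attached to %s\n", trid->traddr);
    }

    int
    main(void)
    {
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "post_init_demo";    /* illustrative name */
        if (spdk_env_init(&opts) < 0)
            return 1;
        /* NULL trid: enumerate all local PCIe NVMe devices. */
        return spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL);
    }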
00:05:32.848 00:05:32.848 real 0m4.661s 00:05:32.848 user 0m3.517s 00:05:32.848 sys 0m0.386s 00:05:32.848 19:03:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.848 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.848 ************************************ 00:05:32.848 END TEST env_dpdk_post_init 00:05:32.848 ************************************ 00:05:32.848 19:03:51 -- env/env.sh@26 -- # uname 00:05:32.848 19:03:51 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:32.848 19:03:51 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:32.848 19:03:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.848 19:03:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.848 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.848 ************************************ 00:05:32.848 START TEST env_mem_callbacks 00:05:32.848 ************************************ 00:05:32.848 19:03:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:32.848 EAL: Detected CPU lcores: 112 00:05:32.848 EAL: Detected NUMA nodes: 2 00:05:32.848 EAL: Detected static linkage of DPDK 00:05:32.848 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:32.848 EAL: Selected IOVA mode 'VA' 00:05:32.848 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.848 EAL: VFIO support initialized 00:05:32.848 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:32.848 00:05:32.848 00:05:32.848 CUnit - A unit testing framework for C - Version 2.1-3 00:05:32.848 http://cunit.sourceforge.net/ 00:05:32.848 00:05:32.848 00:05:32.848 Suite: memory 00:05:32.848 Test: test ... 
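[editor's note] The register/unregister lines that follow are printed by the test's memory-map notify callback: each spdk_malloc-backed buffer is announced to every registered map as it enters and leaves the address space. A minimal sketch of such a hook, assuming the public spdk_mem_map API (the test's own callback lives under test/env/mem_callbacks):

    #include <stdio.h>
    #include "spdk/env.h"

    /* Invoked as regions enter or leave the SPDK address space. */
    static int
    notify_cb(void *cb_ctx, struct spdk_mem_map *map,
              enum spdk_mem_map_notify_action action,
              void *vaddr, size_t size)
    {
        printf("%s %p %zu\n",
               action == SPDK_MEM_MAP_NOTIFY_REGISTER ?
               "register" : "unregister", vaddr, size);
        return 0;
    }

    static const struct spdk_mem_map_ops ops = {
        .notify_cb = notify_cb,
    };

    /* After spdk_env_init(): every subsequent spdk_malloc()/
     * spdk_free() pair produces a register/unregister event. */
    struct spdk_mem_map *map = spdk_mem_map_alloc(0, &ops, NULL);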
00:05:32.848 register 0x200000200000 2097152 00:05:32.848 malloc 3145728 00:05:32.848 register 0x200000400000 4194304 00:05:32.848 buf 0x200000500000 len 3145728 PASSED 00:05:32.848 malloc 64 00:05:32.848 buf 0x2000004fff40 len 64 PASSED 00:05:32.848 malloc 4194304 00:05:32.848 register 0x200000800000 6291456 00:05:32.848 buf 0x200000a00000 len 4194304 PASSED 00:05:32.848 free 0x200000500000 3145728 00:05:32.848 free 0x2000004fff40 64 00:05:32.848 unregister 0x200000400000 4194304 PASSED 00:05:32.848 free 0x200000a00000 4194304 00:05:32.848 unregister 0x200000800000 6291456 PASSED 00:05:32.848 malloc 8388608 00:05:32.848 register 0x200000400000 10485760 00:05:32.848 buf 0x200000600000 len 8388608 PASSED 00:05:32.848 free 0x200000600000 8388608 00:05:32.848 unregister 0x200000400000 10485760 PASSED 00:05:32.848 passed 00:05:32.848 00:05:32.848 Run Summary: Type Total Ran Passed Failed Inactive 00:05:32.848 suites 1 1 n/a 0 0 00:05:32.848 tests 1 1 1 0 0 00:05:32.848 asserts 15 15 15 0 n/a 00:05:32.848 00:05:32.848 Elapsed time = 0.005 seconds 00:05:32.848 00:05:32.848 real 0m0.066s 00:05:32.848 user 0m0.021s 00:05:32.848 sys 0m0.045s 00:05:32.848 19:03:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.848 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.848 ************************************ 00:05:32.848 END TEST env_mem_callbacks 00:05:32.848 ************************************ 00:05:32.848 00:05:32.848 real 0m6.428s 00:05:32.848 user 0m4.456s 00:05:32.848 sys 0m1.234s 00:05:32.848 19:03:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.848 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.848 ************************************ 00:05:32.848 END TEST env 00:05:32.848 ************************************ 00:05:32.848 19:03:51 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:32.848 19:03:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.848 19:03:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.848 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.848 ************************************ 00:05:32.848 START TEST rpc 00:05:32.848 ************************************ 00:05:32.848 19:03:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:32.848 * Looking for test storage... 
00:05:32.848 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:32.848 19:03:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:32.848 19:03:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:32.848 19:03:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:32.848 19:03:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:32.848 19:03:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:32.848 19:03:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:32.848 19:03:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:32.848 19:03:51 -- scripts/common.sh@335 -- # IFS=.-: 00:05:32.848 19:03:51 -- scripts/common.sh@335 -- # read -ra ver1 00:05:32.848 19:03:51 -- scripts/common.sh@336 -- # IFS=.-: 00:05:32.848 19:03:51 -- scripts/common.sh@336 -- # read -ra ver2 00:05:32.848 19:03:51 -- scripts/common.sh@337 -- # local 'op=<' 00:05:32.848 19:03:51 -- scripts/common.sh@339 -- # ver1_l=2 00:05:32.848 19:03:51 -- scripts/common.sh@340 -- # ver2_l=1 00:05:32.848 19:03:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:32.848 19:03:51 -- scripts/common.sh@343 -- # case "$op" in 00:05:32.848 19:03:51 -- scripts/common.sh@344 -- # : 1 00:05:32.848 19:03:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:32.848 19:03:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:32.848 19:03:51 -- scripts/common.sh@364 -- # decimal 1 00:05:32.848 19:03:51 -- scripts/common.sh@352 -- # local d=1 00:05:32.848 19:03:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:32.848 19:03:51 -- scripts/common.sh@354 -- # echo 1 00:05:32.848 19:03:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:32.848 19:03:51 -- scripts/common.sh@365 -- # decimal 2 00:05:32.848 19:03:51 -- scripts/common.sh@352 -- # local d=2 00:05:32.848 19:03:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:32.848 19:03:51 -- scripts/common.sh@354 -- # echo 2 00:05:32.848 19:03:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:32.848 19:03:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:32.848 19:03:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:32.848 19:03:51 -- scripts/common.sh@367 -- # return 0 00:05:32.848 19:03:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:32.848 19:03:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:32.848 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.849 --rc genhtml_branch_coverage=1 00:05:32.849 --rc genhtml_function_coverage=1 00:05:32.849 --rc genhtml_legend=1 00:05:32.849 --rc geninfo_all_blocks=1 00:05:32.849 --rc geninfo_unexecuted_blocks=1 00:05:32.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.849 ' 00:05:32.849 19:03:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:32.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.849 --rc genhtml_branch_coverage=1 00:05:32.849 --rc genhtml_function_coverage=1 00:05:32.849 --rc genhtml_legend=1 00:05:32.849 --rc geninfo_all_blocks=1 00:05:32.849 --rc geninfo_unexecuted_blocks=1 00:05:32.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.849 ' 00:05:32.849 19:03:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:32.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.849 --rc genhtml_branch_coverage=1 00:05:32.849 
--rc genhtml_function_coverage=1 00:05:32.849 --rc genhtml_legend=1 00:05:32.849 --rc geninfo_all_blocks=1 00:05:32.849 --rc geninfo_unexecuted_blocks=1 00:05:32.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.849 ' 00:05:32.849 19:03:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:32.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.849 --rc genhtml_branch_coverage=1 00:05:32.849 --rc genhtml_function_coverage=1 00:05:32.849 --rc genhtml_legend=1 00:05:32.849 --rc geninfo_all_blocks=1 00:05:32.849 --rc geninfo_unexecuted_blocks=1 00:05:32.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.849 ' 00:05:32.849 19:03:51 -- rpc/rpc.sh@65 -- # spdk_pid=1275462 00:05:32.849 19:03:51 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:32.849 19:03:51 -- rpc/rpc.sh@67 -- # waitforlisten 1275462 00:05:32.849 19:03:51 -- common/autotest_common.sh@829 -- # '[' -z 1275462 ']' 00:05:32.849 19:03:51 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:32.849 19:03:51 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:32.849 19:03:51 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:32.849 19:03:51 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:32.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:32.849 19:03:51 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:32.849 19:03:51 -- common/autotest_common.sh@10 -- # set +x 00:05:32.849 [2024-11-18 19:03:51.436124] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:32.849 [2024-11-18 19:03:51.436210] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1275462 ] 00:05:33.108 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.108 [2024-11-18 19:03:51.504300] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.108 [2024-11-18 19:03:51.578974] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:33.108 [2024-11-18 19:03:51.579070] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:33.108 [2024-11-18 19:03:51.579080] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1275462' to capture a snapshot of events at runtime. 00:05:33.108 [2024-11-18 19:03:51.579089] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1275462 for offline analysis/debug. 
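[editor's note] Once spdk_tgt is listening, every rpc_cmd call in this suite is a JSON-RPC 2.0 request over the Unix socket /var/tmp/spdk.sock (normally sent via scripts/rpc.py). A bare-bones C client showing the same exchange; rpc_get_methods is a standard SPDK RPC, used here only as a harmless probe:

    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    int
    main(void)
    {
        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        const char *req =
            "{\"jsonrpc\":\"2.0\",\"method\":\"rpc_get_methods\",\"id\":1}";
        char resp[8192];

        strncpy(addr.sun_path, "/var/tmp/spdk.sock",
                sizeof(addr.sun_path) - 1);
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        if (fd < 0 ||
            connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0)
            return 1;
        write(fd, req, strlen(req));

        /* One read is enough for a short reply; a real client
         * would loop until the JSON object is complete. */
        ssize_t n = read(fd, resp, sizeof(resp) - 1);
        if (n > 0) {
            resp[n] = '\0';
            printf("%s\n", resp);
        }
        close(fd);
        return 0;
    }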
00:05:33.108 [2024-11-18 19:03:51.579107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.683 19:03:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:33.683 19:03:52 -- common/autotest_common.sh@862 -- # return 0 00:05:33.683 19:03:52 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:33.683 19:03:52 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:33.683 19:03:52 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:33.683 19:03:52 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:33.683 19:03:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.683 19:03:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.683 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:33.683 ************************************ 00:05:33.683 START TEST rpc_integrity 00:05:33.683 ************************************ 00:05:33.683 19:03:52 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:33.683 19:03:52 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:33.683 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:33.683 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:33.683 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:33.683 19:03:52 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:33.683 19:03:52 -- rpc/rpc.sh@13 -- # jq length 00:05:34.007 19:03:52 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:34.007 19:03:52 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:34.007 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.007 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.007 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.007 19:03:52 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:34.007 19:03:52 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:34.007 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.007 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.007 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.007 19:03:52 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:34.007 { 00:05:34.007 "name": "Malloc0", 00:05:34.007 "aliases": [ 00:05:34.007 "5d74b12e-096b-4e69-b1b0-989bb500eecc" 00:05:34.007 ], 00:05:34.007 "product_name": "Malloc disk", 00:05:34.007 "block_size": 512, 00:05:34.007 "num_blocks": 16384, 00:05:34.007 "uuid": "5d74b12e-096b-4e69-b1b0-989bb500eecc", 00:05:34.007 "assigned_rate_limits": { 00:05:34.007 "rw_ios_per_sec": 0, 00:05:34.007 "rw_mbytes_per_sec": 0, 00:05:34.007 "r_mbytes_per_sec": 0, 00:05:34.007 "w_mbytes_per_sec": 0 00:05:34.007 }, 00:05:34.007 "claimed": false, 00:05:34.007 "zoned": false, 00:05:34.007 "supported_io_types": { 00:05:34.007 "read": true, 00:05:34.007 "write": true, 00:05:34.007 "unmap": true, 00:05:34.007 "write_zeroes": true, 00:05:34.007 "flush": true, 00:05:34.007 "reset": true, 00:05:34.007 "compare": false, 00:05:34.007 "compare_and_write": false, 
00:05:34.007 "abort": true, 00:05:34.007 "nvme_admin": false, 00:05:34.007 "nvme_io": false 00:05:34.007 }, 00:05:34.007 "memory_domains": [ 00:05:34.007 { 00:05:34.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.007 "dma_device_type": 2 00:05:34.007 } 00:05:34.007 ], 00:05:34.007 "driver_specific": {} 00:05:34.007 } 00:05:34.007 ]' 00:05:34.007 19:03:52 -- rpc/rpc.sh@17 -- # jq length 00:05:34.007 19:03:52 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:34.007 19:03:52 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:34.007 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.007 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.007 [2024-11-18 19:03:52.375450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:34.007 [2024-11-18 19:03:52.375482] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:34.007 [2024-11-18 19:03:52.375503] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5a4e030 00:05:34.007 [2024-11-18 19:03:52.375513] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:34.007 [2024-11-18 19:03:52.376331] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:34.007 [2024-11-18 19:03:52.376353] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:34.007 Passthru0 00:05:34.007 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.007 19:03:52 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:34.007 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.007 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.007 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.007 19:03:52 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:34.007 { 00:05:34.007 "name": "Malloc0", 00:05:34.007 "aliases": [ 00:05:34.007 "5d74b12e-096b-4e69-b1b0-989bb500eecc" 00:05:34.007 ], 00:05:34.007 "product_name": "Malloc disk", 00:05:34.007 "block_size": 512, 00:05:34.007 "num_blocks": 16384, 00:05:34.007 "uuid": "5d74b12e-096b-4e69-b1b0-989bb500eecc", 00:05:34.007 "assigned_rate_limits": { 00:05:34.007 "rw_ios_per_sec": 0, 00:05:34.007 "rw_mbytes_per_sec": 0, 00:05:34.007 "r_mbytes_per_sec": 0, 00:05:34.007 "w_mbytes_per_sec": 0 00:05:34.007 }, 00:05:34.007 "claimed": true, 00:05:34.007 "claim_type": "exclusive_write", 00:05:34.007 "zoned": false, 00:05:34.007 "supported_io_types": { 00:05:34.008 "read": true, 00:05:34.008 "write": true, 00:05:34.008 "unmap": true, 00:05:34.008 "write_zeroes": true, 00:05:34.008 "flush": true, 00:05:34.008 "reset": true, 00:05:34.008 "compare": false, 00:05:34.008 "compare_and_write": false, 00:05:34.008 "abort": true, 00:05:34.008 "nvme_admin": false, 00:05:34.008 "nvme_io": false 00:05:34.008 }, 00:05:34.008 "memory_domains": [ 00:05:34.008 { 00:05:34.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.008 "dma_device_type": 2 00:05:34.008 } 00:05:34.008 ], 00:05:34.008 "driver_specific": {} 00:05:34.008 }, 00:05:34.008 { 00:05:34.008 "name": "Passthru0", 00:05:34.008 "aliases": [ 00:05:34.008 "08d4db67-69c4-5b71-bea2-29c21dfd135d" 00:05:34.008 ], 00:05:34.008 "product_name": "passthru", 00:05:34.008 "block_size": 512, 00:05:34.008 "num_blocks": 16384, 00:05:34.008 "uuid": "08d4db67-69c4-5b71-bea2-29c21dfd135d", 00:05:34.008 "assigned_rate_limits": { 00:05:34.008 "rw_ios_per_sec": 0, 00:05:34.008 "rw_mbytes_per_sec": 0, 00:05:34.008 "r_mbytes_per_sec": 0, 00:05:34.008 
"w_mbytes_per_sec": 0 00:05:34.008 }, 00:05:34.008 "claimed": false, 00:05:34.008 "zoned": false, 00:05:34.008 "supported_io_types": { 00:05:34.008 "read": true, 00:05:34.008 "write": true, 00:05:34.008 "unmap": true, 00:05:34.008 "write_zeroes": true, 00:05:34.008 "flush": true, 00:05:34.008 "reset": true, 00:05:34.008 "compare": false, 00:05:34.008 "compare_and_write": false, 00:05:34.008 "abort": true, 00:05:34.008 "nvme_admin": false, 00:05:34.008 "nvme_io": false 00:05:34.008 }, 00:05:34.008 "memory_domains": [ 00:05:34.008 { 00:05:34.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.008 "dma_device_type": 2 00:05:34.008 } 00:05:34.008 ], 00:05:34.008 "driver_specific": { 00:05:34.008 "passthru": { 00:05:34.008 "name": "Passthru0", 00:05:34.008 "base_bdev_name": "Malloc0" 00:05:34.008 } 00:05:34.008 } 00:05:34.008 } 00:05:34.008 ]' 00:05:34.008 19:03:52 -- rpc/rpc.sh@21 -- # jq length 00:05:34.008 19:03:52 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:34.008 19:03:52 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:34.008 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.008 19:03:52 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:34.008 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.008 19:03:52 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:34.008 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.008 19:03:52 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:34.008 19:03:52 -- rpc/rpc.sh@26 -- # jq length 00:05:34.008 19:03:52 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:34.008 00:05:34.008 real 0m0.244s 00:05:34.008 user 0m0.142s 00:05:34.008 sys 0m0.036s 00:05:34.008 19:03:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 ************************************ 00:05:34.008 END TEST rpc_integrity 00:05:34.008 ************************************ 00:05:34.008 19:03:52 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:34.008 19:03:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.008 19:03:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 ************************************ 00:05:34.008 START TEST rpc_plugins 00:05:34.008 ************************************ 00:05:34.008 19:03:52 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:34.008 19:03:52 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:34.008 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.008 19:03:52 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:34.008 19:03:52 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:34.008 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.008 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.008 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.008 19:03:52 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:34.008 { 00:05:34.008 "name": "Malloc1", 00:05:34.008 "aliases": [ 00:05:34.008 "0cbdf607-e0ed-4272-84e7-a7cba2f1390b" 00:05:34.008 ], 00:05:34.008 "product_name": "Malloc disk", 00:05:34.008 "block_size": 4096, 00:05:34.008 "num_blocks": 256, 00:05:34.008 "uuid": "0cbdf607-e0ed-4272-84e7-a7cba2f1390b", 00:05:34.008 "assigned_rate_limits": { 00:05:34.008 "rw_ios_per_sec": 0, 00:05:34.008 "rw_mbytes_per_sec": 0, 00:05:34.008 "r_mbytes_per_sec": 0, 00:05:34.008 "w_mbytes_per_sec": 0 00:05:34.008 }, 00:05:34.008 "claimed": false, 00:05:34.008 "zoned": false, 00:05:34.008 "supported_io_types": { 00:05:34.008 "read": true, 00:05:34.008 "write": true, 00:05:34.008 "unmap": true, 00:05:34.008 "write_zeroes": true, 00:05:34.008 "flush": true, 00:05:34.008 "reset": true, 00:05:34.008 "compare": false, 00:05:34.008 "compare_and_write": false, 00:05:34.008 "abort": true, 00:05:34.008 "nvme_admin": false, 00:05:34.008 "nvme_io": false 00:05:34.008 }, 00:05:34.008 "memory_domains": [ 00:05:34.008 { 00:05:34.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.008 "dma_device_type": 2 00:05:34.008 } 00:05:34.008 ], 00:05:34.008 "driver_specific": {} 00:05:34.008 } 00:05:34.008 ]' 00:05:34.008 19:03:52 -- rpc/rpc.sh@32 -- # jq length 00:05:34.373 19:03:52 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:34.373 19:03:52 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:34.373 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.373 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.373 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.373 19:03:52 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:34.373 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.373 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.373 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.373 19:03:52 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:34.373 19:03:52 -- rpc/rpc.sh@36 -- # jq length 00:05:34.373 19:03:52 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:34.373 00:05:34.373 real 0m0.137s 00:05:34.373 user 0m0.086s 00:05:34.373 sys 0m0.016s 00:05:34.373 19:03:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.373 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.373 ************************************ 00:05:34.373 END TEST rpc_plugins 00:05:34.373 ************************************ 00:05:34.373 19:03:52 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:34.373 19:03:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.373 19:03:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.373 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.373 ************************************ 00:05:34.373 START TEST rpc_trace_cmd_test 00:05:34.373 ************************************ 00:05:34.373 19:03:52 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:34.373 19:03:52 -- rpc/rpc.sh@40 -- # local info 00:05:34.374 19:03:52 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:34.374 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.374 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.374 19:03:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.374 19:03:52 -- rpc/rpc.sh@42 -- # info='{ 00:05:34.374 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1275462", 00:05:34.374 "tpoint_group_mask": "0x8", 00:05:34.374 "iscsi_conn": { 00:05:34.374 "mask": "0x2", 
00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "scsi": { 00:05:34.374 "mask": "0x4", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "bdev": { 00:05:34.374 "mask": "0x8", 00:05:34.374 "tpoint_mask": "0xffffffffffffffff" 00:05:34.374 }, 00:05:34.374 "nvmf_rdma": { 00:05:34.374 "mask": "0x10", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "nvmf_tcp": { 00:05:34.374 "mask": "0x20", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "ftl": { 00:05:34.374 "mask": "0x40", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "blobfs": { 00:05:34.374 "mask": "0x80", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "dsa": { 00:05:34.374 "mask": "0x200", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "thread": { 00:05:34.374 "mask": "0x400", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "nvme_pcie": { 00:05:34.374 "mask": "0x800", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "iaa": { 00:05:34.374 "mask": "0x1000", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "nvme_tcp": { 00:05:34.374 "mask": "0x2000", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 }, 00:05:34.374 "bdev_nvme": { 00:05:34.374 "mask": "0x4000", 00:05:34.374 "tpoint_mask": "0x0" 00:05:34.374 } 00:05:34.374 }' 00:05:34.374 19:03:52 -- rpc/rpc.sh@43 -- # jq length 00:05:34.374 19:03:52 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:34.374 19:03:52 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:34.374 19:03:52 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:34.374 19:03:52 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:34.374 19:03:52 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:34.374 19:03:52 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:34.374 19:03:52 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:34.374 19:03:52 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:34.374 19:03:52 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:34.374 00:05:34.374 real 0m0.221s 00:05:34.374 user 0m0.179s 00:05:34.374 sys 0m0.035s 00:05:34.374 19:03:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.374 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.374 ************************************ 00:05:34.374 END TEST rpc_trace_cmd_test 00:05:34.374 ************************************ 00:05:34.633 19:03:52 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:34.633 19:03:52 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:34.633 19:03:52 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:34.633 19:03:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.633 19:03:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.633 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.633 ************************************ 00:05:34.633 START TEST rpc_daemon_integrity 00:05:34.633 ************************************ 00:05:34.634 19:03:52 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:34.634 19:03:52 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:34.634 19:03:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:52 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:34.634 19:03:53 -- rpc/rpc.sh@13 -- # jq length 00:05:34.634 19:03:53 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:34.634 19:03:53 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:34.634 
19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:34.634 19:03:53 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:34.634 { 00:05:34.634 "name": "Malloc2", 00:05:34.634 "aliases": [ 00:05:34.634 "3ee0a955-0664-449d-aeae-9e2c2f7c83f8" 00:05:34.634 ], 00:05:34.634 "product_name": "Malloc disk", 00:05:34.634 "block_size": 512, 00:05:34.634 "num_blocks": 16384, 00:05:34.634 "uuid": "3ee0a955-0664-449d-aeae-9e2c2f7c83f8", 00:05:34.634 "assigned_rate_limits": { 00:05:34.634 "rw_ios_per_sec": 0, 00:05:34.634 "rw_mbytes_per_sec": 0, 00:05:34.634 "r_mbytes_per_sec": 0, 00:05:34.634 "w_mbytes_per_sec": 0 00:05:34.634 }, 00:05:34.634 "claimed": false, 00:05:34.634 "zoned": false, 00:05:34.634 "supported_io_types": { 00:05:34.634 "read": true, 00:05:34.634 "write": true, 00:05:34.634 "unmap": true, 00:05:34.634 "write_zeroes": true, 00:05:34.634 "flush": true, 00:05:34.634 "reset": true, 00:05:34.634 "compare": false, 00:05:34.634 "compare_and_write": false, 00:05:34.634 "abort": true, 00:05:34.634 "nvme_admin": false, 00:05:34.634 "nvme_io": false 00:05:34.634 }, 00:05:34.634 "memory_domains": [ 00:05:34.634 { 00:05:34.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.634 "dma_device_type": 2 00:05:34.634 } 00:05:34.634 ], 00:05:34.634 "driver_specific": {} 00:05:34.634 } 00:05:34.634 ]' 00:05:34.634 19:03:53 -- rpc/rpc.sh@17 -- # jq length 00:05:34.634 19:03:53 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:34.634 19:03:53 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 [2024-11-18 19:03:53.125408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:34.634 [2024-11-18 19:03:53.125435] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:34.634 [2024-11-18 19:03:53.125451] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5bd7980 00:05:34.634 [2024-11-18 19:03:53.125460] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:34.634 [2024-11-18 19:03:53.126159] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:34.634 [2024-11-18 19:03:53.126179] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:34.634 Passthru0 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:34.634 { 00:05:34.634 "name": "Malloc2", 00:05:34.634 "aliases": [ 00:05:34.634 "3ee0a955-0664-449d-aeae-9e2c2f7c83f8" 00:05:34.634 ], 00:05:34.634 "product_name": "Malloc disk", 00:05:34.634 "block_size": 512, 00:05:34.634 "num_blocks": 16384, 
00:05:34.634 "uuid": "3ee0a955-0664-449d-aeae-9e2c2f7c83f8", 00:05:34.634 "assigned_rate_limits": { 00:05:34.634 "rw_ios_per_sec": 0, 00:05:34.634 "rw_mbytes_per_sec": 0, 00:05:34.634 "r_mbytes_per_sec": 0, 00:05:34.634 "w_mbytes_per_sec": 0 00:05:34.634 }, 00:05:34.634 "claimed": true, 00:05:34.634 "claim_type": "exclusive_write", 00:05:34.634 "zoned": false, 00:05:34.634 "supported_io_types": { 00:05:34.634 "read": true, 00:05:34.634 "write": true, 00:05:34.634 "unmap": true, 00:05:34.634 "write_zeroes": true, 00:05:34.634 "flush": true, 00:05:34.634 "reset": true, 00:05:34.634 "compare": false, 00:05:34.634 "compare_and_write": false, 00:05:34.634 "abort": true, 00:05:34.634 "nvme_admin": false, 00:05:34.634 "nvme_io": false 00:05:34.634 }, 00:05:34.634 "memory_domains": [ 00:05:34.634 { 00:05:34.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.634 "dma_device_type": 2 00:05:34.634 } 00:05:34.634 ], 00:05:34.634 "driver_specific": {} 00:05:34.634 }, 00:05:34.634 { 00:05:34.634 "name": "Passthru0", 00:05:34.634 "aliases": [ 00:05:34.634 "bf268dc1-dcc4-5527-a02d-dfeedacf7e9d" 00:05:34.634 ], 00:05:34.634 "product_name": "passthru", 00:05:34.634 "block_size": 512, 00:05:34.634 "num_blocks": 16384, 00:05:34.634 "uuid": "bf268dc1-dcc4-5527-a02d-dfeedacf7e9d", 00:05:34.634 "assigned_rate_limits": { 00:05:34.634 "rw_ios_per_sec": 0, 00:05:34.634 "rw_mbytes_per_sec": 0, 00:05:34.634 "r_mbytes_per_sec": 0, 00:05:34.634 "w_mbytes_per_sec": 0 00:05:34.634 }, 00:05:34.634 "claimed": false, 00:05:34.634 "zoned": false, 00:05:34.634 "supported_io_types": { 00:05:34.634 "read": true, 00:05:34.634 "write": true, 00:05:34.634 "unmap": true, 00:05:34.634 "write_zeroes": true, 00:05:34.634 "flush": true, 00:05:34.634 "reset": true, 00:05:34.634 "compare": false, 00:05:34.634 "compare_and_write": false, 00:05:34.634 "abort": true, 00:05:34.634 "nvme_admin": false, 00:05:34.634 "nvme_io": false 00:05:34.634 }, 00:05:34.634 "memory_domains": [ 00:05:34.634 { 00:05:34.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:34.634 "dma_device_type": 2 00:05:34.634 } 00:05:34.634 ], 00:05:34.634 "driver_specific": { 00:05:34.634 "passthru": { 00:05:34.634 "name": "Passthru0", 00:05:34.634 "base_bdev_name": "Malloc2" 00:05:34.634 } 00:05:34.634 } 00:05:34.634 } 00:05:34.634 ]' 00:05:34.634 19:03:53 -- rpc/rpc.sh@21 -- # jq length 00:05:34.634 19:03:53 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:34.634 19:03:53 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:34.634 19:03:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.634 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.634 19:03:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.634 19:03:53 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:34.634 19:03:53 -- rpc/rpc.sh@26 -- # jq length 00:05:34.894 19:03:53 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:34.894 00:05:34.894 real 0m0.250s 00:05:34.894 user 0m0.159s 00:05:34.894 sys 0m0.030s 00:05:34.894 
19:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.894 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:34.894 ************************************ 00:05:34.894 END TEST rpc_daemon_integrity 00:05:34.894 ************************************ 00:05:34.894 19:03:53 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:34.894 19:03:53 -- rpc/rpc.sh@84 -- # killprocess 1275462 00:05:34.894 19:03:53 -- common/autotest_common.sh@936 -- # '[' -z 1275462 ']' 00:05:34.894 19:03:53 -- common/autotest_common.sh@940 -- # kill -0 1275462 00:05:34.894 19:03:53 -- common/autotest_common.sh@941 -- # uname 00:05:34.894 19:03:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:34.894 19:03:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1275462 00:05:34.894 19:03:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:34.894 19:03:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:34.894 19:03:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1275462' 00:05:34.894 killing process with pid 1275462 00:05:34.894 19:03:53 -- common/autotest_common.sh@955 -- # kill 1275462 00:05:34.894 19:03:53 -- common/autotest_common.sh@960 -- # wait 1275462 00:05:35.154 00:05:35.154 real 0m2.425s 00:05:35.154 user 0m3.018s 00:05:35.154 sys 0m0.687s 00:05:35.154 19:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.154 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:35.154 ************************************ 00:05:35.154 END TEST rpc 00:05:35.154 ************************************ 00:05:35.154 19:03:53 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:35.154 19:03:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.154 19:03:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.154 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:35.154 ************************************ 00:05:35.154 START TEST rpc_client 00:05:35.154 ************************************ 00:05:35.154 19:03:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:35.414 * Looking for test storage... 
00:05:35.414 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:35.414 19:03:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:35.414 19:03:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:35.414 19:03:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:35.414 19:03:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:35.414 19:03:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:35.414 19:03:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:35.414 19:03:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:35.414 19:03:53 -- scripts/common.sh@335 -- # IFS=.-: 00:05:35.414 19:03:53 -- scripts/common.sh@335 -- # read -ra ver1 00:05:35.414 19:03:53 -- scripts/common.sh@336 -- # IFS=.-: 00:05:35.414 19:03:53 -- scripts/common.sh@336 -- # read -ra ver2 00:05:35.414 19:03:53 -- scripts/common.sh@337 -- # local 'op=<' 00:05:35.414 19:03:53 -- scripts/common.sh@339 -- # ver1_l=2 00:05:35.414 19:03:53 -- scripts/common.sh@340 -- # ver2_l=1 00:05:35.414 19:03:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:35.414 19:03:53 -- scripts/common.sh@343 -- # case "$op" in 00:05:35.414 19:03:53 -- scripts/common.sh@344 -- # : 1 00:05:35.414 19:03:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:35.414 19:03:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:35.414 19:03:53 -- scripts/common.sh@364 -- # decimal 1 00:05:35.414 19:03:53 -- scripts/common.sh@352 -- # local d=1 00:05:35.414 19:03:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:35.414 19:03:53 -- scripts/common.sh@354 -- # echo 1 00:05:35.414 19:03:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:35.414 19:03:53 -- scripts/common.sh@365 -- # decimal 2 00:05:35.414 19:03:53 -- scripts/common.sh@352 -- # local d=2 00:05:35.414 19:03:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:35.414 19:03:53 -- scripts/common.sh@354 -- # echo 2 00:05:35.414 19:03:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:35.414 19:03:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:35.414 19:03:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:35.414 19:03:53 -- scripts/common.sh@367 -- # return 0 00:05:35.414 19:03:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:35.414 19:03:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:35.414 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.414 --rc genhtml_branch_coverage=1 00:05:35.414 --rc genhtml_function_coverage=1 00:05:35.414 --rc genhtml_legend=1 00:05:35.414 --rc geninfo_all_blocks=1 00:05:35.414 --rc geninfo_unexecuted_blocks=1 00:05:35.414 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.414 ' 00:05:35.414 19:03:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:35.414 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.414 --rc genhtml_branch_coverage=1 00:05:35.414 --rc genhtml_function_coverage=1 00:05:35.414 --rc genhtml_legend=1 00:05:35.414 --rc geninfo_all_blocks=1 00:05:35.414 --rc geninfo_unexecuted_blocks=1 00:05:35.415 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.415 ' 00:05:35.415 19:03:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:35.415 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.415 --rc genhtml_branch_coverage=1 
00:05:35.415 --rc genhtml_function_coverage=1 00:05:35.415 --rc genhtml_legend=1 00:05:35.415 --rc geninfo_all_blocks=1 00:05:35.415 --rc geninfo_unexecuted_blocks=1 00:05:35.415 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.415 ' 00:05:35.415 19:03:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:35.415 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.415 --rc genhtml_branch_coverage=1 00:05:35.415 --rc genhtml_function_coverage=1 00:05:35.415 --rc genhtml_legend=1 00:05:35.415 --rc geninfo_all_blocks=1 00:05:35.415 --rc geninfo_unexecuted_blocks=1 00:05:35.415 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.415 ' 00:05:35.415 19:03:53 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:35.415 OK 00:05:35.415 19:03:53 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:35.415 00:05:35.415 real 0m0.184s 00:05:35.415 user 0m0.103s 00:05:35.415 sys 0m0.097s 00:05:35.415 19:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.415 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:35.415 ************************************ 00:05:35.415 END TEST rpc_client 00:05:35.415 ************************************ 00:05:35.415 19:03:53 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:35.415 19:03:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.415 19:03:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.415 19:03:53 -- common/autotest_common.sh@10 -- # set +x 00:05:35.415 ************************************ 00:05:35.415 START TEST json_config 00:05:35.415 ************************************ 00:05:35.415 19:03:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:35.675 19:03:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:35.675 19:03:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:35.675 19:03:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:35.675 19:03:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:35.675 19:03:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:35.675 19:03:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:35.675 19:03:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:35.675 19:03:54 -- scripts/common.sh@335 -- # IFS=.-: 00:05:35.675 19:03:54 -- scripts/common.sh@335 -- # read -ra ver1 00:05:35.675 19:03:54 -- scripts/common.sh@336 -- # IFS=.-: 00:05:35.675 19:03:54 -- scripts/common.sh@336 -- # read -ra ver2 00:05:35.675 19:03:54 -- scripts/common.sh@337 -- # local 'op=<' 00:05:35.675 19:03:54 -- scripts/common.sh@339 -- # ver1_l=2 00:05:35.675 19:03:54 -- scripts/common.sh@340 -- # ver2_l=1 00:05:35.675 19:03:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:35.675 19:03:54 -- scripts/common.sh@343 -- # case "$op" in 00:05:35.675 19:03:54 -- scripts/common.sh@344 -- # : 1 00:05:35.675 19:03:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:35.675 19:03:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:35.675 19:03:54 -- scripts/common.sh@364 -- # decimal 1 00:05:35.675 19:03:54 -- scripts/common.sh@352 -- # local d=1 00:05:35.675 19:03:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:35.675 19:03:54 -- scripts/common.sh@354 -- # echo 1 00:05:35.675 19:03:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:35.675 19:03:54 -- scripts/common.sh@365 -- # decimal 2 00:05:35.675 19:03:54 -- scripts/common.sh@352 -- # local d=2 00:05:35.675 19:03:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:35.675 19:03:54 -- scripts/common.sh@354 -- # echo 2 00:05:35.675 19:03:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:35.675 19:03:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:35.675 19:03:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:35.675 19:03:54 -- scripts/common.sh@367 -- # return 0 00:05:35.675 19:03:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:35.675 19:03:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:35.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.675 --rc genhtml_branch_coverage=1 00:05:35.675 --rc genhtml_function_coverage=1 00:05:35.675 --rc genhtml_legend=1 00:05:35.675 --rc geninfo_all_blocks=1 00:05:35.675 --rc geninfo_unexecuted_blocks=1 00:05:35.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.675 ' 00:05:35.675 19:03:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:35.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.675 --rc genhtml_branch_coverage=1 00:05:35.675 --rc genhtml_function_coverage=1 00:05:35.675 --rc genhtml_legend=1 00:05:35.675 --rc geninfo_all_blocks=1 00:05:35.675 --rc geninfo_unexecuted_blocks=1 00:05:35.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.675 ' 00:05:35.675 19:03:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:35.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.675 --rc genhtml_branch_coverage=1 00:05:35.675 --rc genhtml_function_coverage=1 00:05:35.675 --rc genhtml_legend=1 00:05:35.675 --rc geninfo_all_blocks=1 00:05:35.675 --rc geninfo_unexecuted_blocks=1 00:05:35.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.675 ' 00:05:35.675 19:03:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:35.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.675 --rc genhtml_branch_coverage=1 00:05:35.675 --rc genhtml_function_coverage=1 00:05:35.675 --rc genhtml_legend=1 00:05:35.676 --rc geninfo_all_blocks=1 00:05:35.676 --rc geninfo_unexecuted_blocks=1 00:05:35.676 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.676 ' 00:05:35.676 19:03:54 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:35.676 19:03:54 -- nvmf/common.sh@7 -- # uname -s 00:05:35.676 19:03:54 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:35.676 19:03:54 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:35.676 19:03:54 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:35.676 19:03:54 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:35.676 19:03:54 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:35.676 19:03:54 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:35.676 19:03:54 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:35.676 19:03:54 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:35.676 19:03:54 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:35.676 19:03:54 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:35.676 19:03:54 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:35.676 19:03:54 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:35.676 19:03:54 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:35.676 19:03:54 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:35.676 19:03:54 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:35.676 19:03:54 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:35.676 19:03:54 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:35.676 19:03:54 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:35.676 19:03:54 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:35.676 19:03:54 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.676 19:03:54 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.676 19:03:54 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.676 19:03:54 -- paths/export.sh@5 -- # export PATH 00:05:35.676 19:03:54 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.676 19:03:54 -- nvmf/common.sh@46 -- # : 0 00:05:35.676 19:03:54 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:35.676 19:03:54 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:35.676 19:03:54 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:35.676 19:03:54 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:35.676 19:03:54 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:35.676 19:03:54 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:35.676 19:03:54 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:35.676 
19:03:54 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:35.676 19:03:54 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:35.676 19:03:54 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:35.676 19:03:54 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:35.676 19:03:54 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:35.676 19:03:54 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:35.676 WARNING: No tests are enabled so not running JSON configuration tests 00:05:35.676 19:03:54 -- json_config/json_config.sh@27 -- # exit 0 00:05:35.676 00:05:35.676 real 0m0.188s 00:05:35.676 user 0m0.114s 00:05:35.676 sys 0m0.082s 00:05:35.676 19:03:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.676 19:03:54 -- common/autotest_common.sh@10 -- # set +x 00:05:35.676 ************************************ 00:05:35.676 END TEST json_config 00:05:35.676 ************************************ 00:05:35.676 19:03:54 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:35.676 19:03:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.676 19:03:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.676 19:03:54 -- common/autotest_common.sh@10 -- # set +x 00:05:35.676 ************************************ 00:05:35.676 START TEST json_config_extra_key 00:05:35.676 ************************************ 00:05:35.676 19:03:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:35.676 19:03:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:35.676 19:03:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:35.676 19:03:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:35.937 19:03:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:35.937 19:03:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:35.937 19:03:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:35.937 19:03:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:35.937 19:03:54 -- scripts/common.sh@335 -- # IFS=.-: 00:05:35.937 19:03:54 -- scripts/common.sh@335 -- # read -ra ver1 00:05:35.937 19:03:54 -- scripts/common.sh@336 -- # IFS=.-: 00:05:35.937 19:03:54 -- scripts/common.sh@336 -- # read -ra ver2 00:05:35.937 19:03:54 -- scripts/common.sh@337 -- # local 'op=<' 00:05:35.937 19:03:54 -- scripts/common.sh@339 -- # ver1_l=2 00:05:35.937 19:03:54 -- scripts/common.sh@340 -- # ver2_l=1 00:05:35.937 19:03:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:35.937 19:03:54 -- scripts/common.sh@343 -- # case "$op" in 00:05:35.937 19:03:54 -- scripts/common.sh@344 -- # : 1 00:05:35.937 19:03:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:35.937 19:03:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:35.937 19:03:54 -- scripts/common.sh@364 -- # decimal 1 00:05:35.937 19:03:54 -- scripts/common.sh@352 -- # local d=1 00:05:35.937 19:03:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:35.937 19:03:54 -- scripts/common.sh@354 -- # echo 1 00:05:35.937 19:03:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:35.937 19:03:54 -- scripts/common.sh@365 -- # decimal 2 00:05:35.937 19:03:54 -- scripts/common.sh@352 -- # local d=2 00:05:35.937 19:03:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:35.937 19:03:54 -- scripts/common.sh@354 -- # echo 2 00:05:35.937 19:03:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:35.937 19:03:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:35.937 19:03:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:35.937 19:03:54 -- scripts/common.sh@367 -- # return 0 00:05:35.937 19:03:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:35.937 19:03:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:35.937 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.937 --rc genhtml_branch_coverage=1 00:05:35.937 --rc genhtml_function_coverage=1 00:05:35.937 --rc genhtml_legend=1 00:05:35.937 --rc geninfo_all_blocks=1 00:05:35.937 --rc geninfo_unexecuted_blocks=1 00:05:35.937 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.937 ' 00:05:35.937 19:03:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:35.937 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.937 --rc genhtml_branch_coverage=1 00:05:35.937 --rc genhtml_function_coverage=1 00:05:35.937 --rc genhtml_legend=1 00:05:35.937 --rc geninfo_all_blocks=1 00:05:35.937 --rc geninfo_unexecuted_blocks=1 00:05:35.937 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.937 ' 00:05:35.937 19:03:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:35.937 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.937 --rc genhtml_branch_coverage=1 00:05:35.937 --rc genhtml_function_coverage=1 00:05:35.937 --rc genhtml_legend=1 00:05:35.937 --rc geninfo_all_blocks=1 00:05:35.937 --rc geninfo_unexecuted_blocks=1 00:05:35.937 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.937 ' 00:05:35.937 19:03:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:35.937 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.937 --rc genhtml_branch_coverage=1 00:05:35.937 --rc genhtml_function_coverage=1 00:05:35.937 --rc genhtml_legend=1 00:05:35.937 --rc geninfo_all_blocks=1 00:05:35.937 --rc geninfo_unexecuted_blocks=1 00:05:35.937 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.937 ' 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:35.937 19:03:54 -- nvmf/common.sh@7 -- # uname -s 00:05:35.937 19:03:54 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:35.937 19:03:54 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:35.937 19:03:54 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:35.937 19:03:54 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:35.937 19:03:54 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:35.937 19:03:54 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:35.937 19:03:54 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:35.937 19:03:54 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:35.937 19:03:54 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:35.937 19:03:54 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:35.937 19:03:54 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:35.937 19:03:54 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:35.937 19:03:54 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:35.937 19:03:54 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:35.937 19:03:54 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:35.937 19:03:54 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:35.937 19:03:54 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:35.937 19:03:54 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:35.937 19:03:54 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:35.937 19:03:54 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.937 19:03:54 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.937 19:03:54 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.937 19:03:54 -- paths/export.sh@5 -- # export PATH 00:05:35.937 19:03:54 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:35.937 19:03:54 -- nvmf/common.sh@46 -- # : 0 00:05:35.937 19:03:54 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:35.937 19:03:54 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:35.937 19:03:54 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:35.937 19:03:54 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:35.937 19:03:54 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:35.937 19:03:54 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:35.937 19:03:54 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:35.937 
19:03:54 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:35.937 INFO: launching applications... 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:35.937 19:03:54 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=1276266 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:35.938 Waiting for target to run... 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 1276266 /var/tmp/spdk_tgt.sock 00:05:35.938 19:03:54 -- common/autotest_common.sh@829 -- # '[' -z 1276266 ']' 00:05:35.938 19:03:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:35.938 19:03:54 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:35.938 19:03:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:35.938 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:35.938 19:03:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:35.938 19:03:54 -- common/autotest_common.sh@10 -- # set +x 00:05:35.938 19:03:54 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:35.938 [2024-11-18 19:03:54.389896] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
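At this point the harness has launched spdk_tgt and blocks in waitforlisten until the target answers on its UNIX-domain RPC socket. A minimal sketch of that wait step, assuming waitforlisten reduces to an RPC poll loop (socket path and retry count are taken from the trace above; the poll interval is an assumption):

    app_sock=/var/tmp/spdk_tgt.sock
    for _ in $(seq 1 100); do                  # max_retries=100, as traced
        # probe the socket with a cheap RPC; success means the target is up
        if scripts/rpc.py -s "$app_sock" rpc_get_methods >/dev/null 2>&1; then
            break
        fi
        sleep 0.1                              # assumed poll interval
    done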
00:05:35.938 [2024-11-18 19:03:54.389961] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1276266 ] 00:05:35.938 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.197 [2024-11-18 19:03:54.672201] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.197 [2024-11-18 19:03:54.736042] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:36.197 [2024-11-18 19:03:54.736143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.765 19:03:55 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:36.765 19:03:55 -- common/autotest_common.sh@862 -- # return 0 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:36.765 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:36.765 INFO: shutting down applications... 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 1276266 ]] 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 1276266 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:36.765 19:03:55 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1276266 00:05:36.766 19:03:55 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1276266 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:37.335 SPDK target shutdown done 00:05:37.335 19:03:55 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:37.335 Success 00:05:37.335 00:05:37.335 real 0m1.527s 00:05:37.335 user 0m1.261s 00:05:37.335 sys 0m0.410s 00:05:37.335 19:03:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.335 19:03:55 -- common/autotest_common.sh@10 -- # set +x 00:05:37.335 ************************************ 00:05:37.335 END TEST json_config_extra_key 00:05:37.335 ************************************ 00:05:37.335 19:03:55 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:37.335 19:03:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:37.335 19:03:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.335 19:03:55 -- common/autotest_common.sh@10 -- # set +x 00:05:37.335 ************************************ 00:05:37.335 START TEST alias_rpc 00:05:37.335 ************************************ 00:05:37.335 19:03:55 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:37.335 * Looking for test storage... 00:05:37.335 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:37.335 19:03:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:37.335 19:03:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:37.335 19:03:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:37.335 19:03:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:37.335 19:03:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:37.335 19:03:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:37.335 19:03:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:37.335 19:03:55 -- scripts/common.sh@335 -- # IFS=.-: 00:05:37.335 19:03:55 -- scripts/common.sh@335 -- # read -ra ver1 00:05:37.335 19:03:55 -- scripts/common.sh@336 -- # IFS=.-: 00:05:37.335 19:03:55 -- scripts/common.sh@336 -- # read -ra ver2 00:05:37.335 19:03:55 -- scripts/common.sh@337 -- # local 'op=<' 00:05:37.335 19:03:55 -- scripts/common.sh@339 -- # ver1_l=2 00:05:37.335 19:03:55 -- scripts/common.sh@340 -- # ver2_l=1 00:05:37.335 19:03:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:37.335 19:03:55 -- scripts/common.sh@343 -- # case "$op" in 00:05:37.335 19:03:55 -- scripts/common.sh@344 -- # : 1 00:05:37.335 19:03:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:37.335 19:03:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:37.335 19:03:55 -- scripts/common.sh@364 -- # decimal 1 00:05:37.335 19:03:55 -- scripts/common.sh@352 -- # local d=1 00:05:37.335 19:03:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:37.335 19:03:55 -- scripts/common.sh@354 -- # echo 1 00:05:37.335 19:03:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:37.335 19:03:55 -- scripts/common.sh@365 -- # decimal 2 00:05:37.335 19:03:55 -- scripts/common.sh@352 -- # local d=2 00:05:37.335 19:03:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:37.594 19:03:55 -- scripts/common.sh@354 -- # echo 2 00:05:37.594 19:03:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:37.594 19:03:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:37.594 19:03:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:37.594 19:03:55 -- scripts/common.sh@367 -- # return 0 00:05:37.594 19:03:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:37.594 19:03:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:37.594 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.594 --rc genhtml_branch_coverage=1 00:05:37.594 --rc genhtml_function_coverage=1 00:05:37.594 --rc genhtml_legend=1 00:05:37.594 --rc geninfo_all_blocks=1 00:05:37.594 --rc geninfo_unexecuted_blocks=1 00:05:37.594 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.594 ' 00:05:37.594 19:03:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:37.594 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.594 --rc genhtml_branch_coverage=1 00:05:37.594 --rc genhtml_function_coverage=1 00:05:37.594 --rc genhtml_legend=1 00:05:37.594 --rc geninfo_all_blocks=1 00:05:37.594 --rc geninfo_unexecuted_blocks=1 00:05:37.594 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.594 ' 00:05:37.594 
19:03:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:37.594 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.594 --rc genhtml_branch_coverage=1 00:05:37.595 --rc genhtml_function_coverage=1 00:05:37.595 --rc genhtml_legend=1 00:05:37.595 --rc geninfo_all_blocks=1 00:05:37.595 --rc geninfo_unexecuted_blocks=1 00:05:37.595 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.595 ' 00:05:37.595 19:03:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:37.595 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.595 --rc genhtml_branch_coverage=1 00:05:37.595 --rc genhtml_function_coverage=1 00:05:37.595 --rc genhtml_legend=1 00:05:37.595 --rc geninfo_all_blocks=1 00:05:37.595 --rc geninfo_unexecuted_blocks=1 00:05:37.595 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.595 ' 00:05:37.595 19:03:55 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:37.595 19:03:55 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1276593 00:05:37.595 19:03:55 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:37.595 19:03:55 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1276593 00:05:37.595 19:03:55 -- common/autotest_common.sh@829 -- # '[' -z 1276593 ']' 00:05:37.595 19:03:55 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.595 19:03:55 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:37.595 19:03:55 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.595 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:37.595 19:03:55 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:37.595 19:03:55 -- common/autotest_common.sh@10 -- # set +x 00:05:37.595 [2024-11-18 19:03:55.966685] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
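The cmp_versions trace above (lt 1.15 2) checks whether the installed lcov predates 2.0, which decides the LCOV_OPTS exported for coverage runs. A sketch of that comparison, assuming it splits versions on dots and compares field by field numerically (not the verbatim scripts/common.sh source):

    lt() {
        local IFS=.
        local -a v1=($1) v2=($2)
        local i
        for ((i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++)); do
            (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0   # strictly less
            (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
        done
        return 1                                          # equal is not less
    }
    lt 1.15 2 && echo "lcov < 2: enable branch/function coverage flags"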
00:05:37.595 [2024-11-18 19:03:55.966755] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1276593 ] 00:05:37.595 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.595 [2024-11-18 19:03:56.034991] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.595 [2024-11-18 19:03:56.105570] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:37.595 [2024-11-18 19:03:56.105689] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.531 19:03:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:38.531 19:03:56 -- common/autotest_common.sh@862 -- # return 0 00:05:38.531 19:03:56 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:38.531 19:03:57 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1276593 00:05:38.531 19:03:57 -- common/autotest_common.sh@936 -- # '[' -z 1276593 ']' 00:05:38.531 19:03:57 -- common/autotest_common.sh@940 -- # kill -0 1276593 00:05:38.531 19:03:57 -- common/autotest_common.sh@941 -- # uname 00:05:38.531 19:03:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:38.531 19:03:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1276593 00:05:38.531 19:03:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:38.531 19:03:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:38.531 19:03:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1276593' 00:05:38.531 killing process with pid 1276593 00:05:38.531 19:03:57 -- common/autotest_common.sh@955 -- # kill 1276593 00:05:38.531 19:03:57 -- common/autotest_common.sh@960 -- # wait 1276593 00:05:38.790 00:05:38.790 real 0m1.606s 00:05:38.790 user 0m1.730s 00:05:38.790 sys 0m0.460s 00:05:38.790 19:03:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.790 19:03:57 -- common/autotest_common.sh@10 -- # set +x 00:05:38.790 ************************************ 00:05:38.790 END TEST alias_rpc 00:05:38.790 ************************************ 00:05:39.050 19:03:57 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:39.050 19:03:57 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:39.050 19:03:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.050 19:03:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.050 19:03:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.050 ************************************ 00:05:39.050 START TEST spdkcli_tcp 00:05:39.050 ************************************ 00:05:39.050 19:03:57 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:39.050 * Looking for test storage... 
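The killprocess helper traced above refuses to kill a sudo process, then signals the target and reaps it. A condensed sketch of the steps visible in the trace (the real helper in autotest_common.sh carries more error handling):

    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1
        kill -0 "$pid" || return 1                        # still running?
        local name; name=$(ps --no-headers -o comm= "$pid")
        [ "$name" = sudo ] && return 1                    # never kill sudo
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" 2>/dev/null || true                   # reap the child
    }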
00:05:39.050 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:39.050 19:03:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:39.050 19:03:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:39.050 19:03:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:39.050 19:03:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:39.050 19:03:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:39.050 19:03:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:39.050 19:03:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:39.050 19:03:57 -- scripts/common.sh@335 -- # IFS=.-: 00:05:39.050 19:03:57 -- scripts/common.sh@335 -- # read -ra ver1 00:05:39.050 19:03:57 -- scripts/common.sh@336 -- # IFS=.-: 00:05:39.050 19:03:57 -- scripts/common.sh@336 -- # read -ra ver2 00:05:39.050 19:03:57 -- scripts/common.sh@337 -- # local 'op=<' 00:05:39.050 19:03:57 -- scripts/common.sh@339 -- # ver1_l=2 00:05:39.050 19:03:57 -- scripts/common.sh@340 -- # ver2_l=1 00:05:39.050 19:03:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:39.050 19:03:57 -- scripts/common.sh@343 -- # case "$op" in 00:05:39.050 19:03:57 -- scripts/common.sh@344 -- # : 1 00:05:39.050 19:03:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:39.050 19:03:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:39.050 19:03:57 -- scripts/common.sh@364 -- # decimal 1 00:05:39.050 19:03:57 -- scripts/common.sh@352 -- # local d=1 00:05:39.050 19:03:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:39.050 19:03:57 -- scripts/common.sh@354 -- # echo 1 00:05:39.050 19:03:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:39.050 19:03:57 -- scripts/common.sh@365 -- # decimal 2 00:05:39.050 19:03:57 -- scripts/common.sh@352 -- # local d=2 00:05:39.050 19:03:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:39.050 19:03:57 -- scripts/common.sh@354 -- # echo 2 00:05:39.050 19:03:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:39.050 19:03:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:39.050 19:03:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:39.050 19:03:57 -- scripts/common.sh@367 -- # return 0 00:05:39.050 19:03:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:39.051 19:03:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:39.051 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:39.051 --rc genhtml_branch_coverage=1 00:05:39.051 --rc genhtml_function_coverage=1 00:05:39.051 --rc genhtml_legend=1 00:05:39.051 --rc geninfo_all_blocks=1 00:05:39.051 --rc geninfo_unexecuted_blocks=1 00:05:39.051 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:39.051 ' 00:05:39.051 19:03:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:39.051 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:39.051 --rc genhtml_branch_coverage=1 00:05:39.051 --rc genhtml_function_coverage=1 00:05:39.051 --rc genhtml_legend=1 00:05:39.051 --rc geninfo_all_blocks=1 00:05:39.051 --rc geninfo_unexecuted_blocks=1 00:05:39.051 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:39.051 ' 00:05:39.051 19:03:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:39.051 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:39.051 --rc genhtml_branch_coverage=1 
00:05:39.051 --rc genhtml_function_coverage=1 00:05:39.051 --rc genhtml_legend=1 00:05:39.051 --rc geninfo_all_blocks=1 00:05:39.051 --rc geninfo_unexecuted_blocks=1 00:05:39.051 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:39.051 ' 00:05:39.051 19:03:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:39.051 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:39.051 --rc genhtml_branch_coverage=1 00:05:39.051 --rc genhtml_function_coverage=1 00:05:39.051 --rc genhtml_legend=1 00:05:39.051 --rc geninfo_all_blocks=1 00:05:39.051 --rc geninfo_unexecuted_blocks=1 00:05:39.051 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:39.051 ' 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:39.051 19:03:57 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:39.051 19:03:57 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:39.051 19:03:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:39.051 19:03:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1276926 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:39.051 19:03:57 -- spdkcli/tcp.sh@27 -- # waitforlisten 1276926 00:05:39.051 19:03:57 -- common/autotest_common.sh@829 -- # '[' -z 1276926 ']' 00:05:39.051 19:03:57 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:39.051 19:03:57 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:39.051 19:03:57 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:39.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:39.051 19:03:57 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:39.051 19:03:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.051 [2024-11-18 19:03:57.644507] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
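spdk_tgt is launched here with -m 0x3, a hexadecimal core bitmask: bits 0 and 1 set correspond to the two "Reactor started" notices on cores 0 and 1 just below. An illustrative one-liner for decoding such a mask (not part of the test itself):

    mask=0x3
    for i in {0..7}; do (( (mask >> i) & 1 )) && echo "reactor on core $i"; done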
00:05:39.051 [2024-11-18 19:03:57.644576] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1276926 ] 00:05:39.310 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.310 [2024-11-18 19:03:57.708932] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:39.310 [2024-11-18 19:03:57.783697] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:39.310 [2024-11-18 19:03:57.783824] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:39.310 [2024-11-18 19:03:57.783825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.879 19:03:58 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:39.879 19:03:58 -- common/autotest_common.sh@862 -- # return 0 00:05:39.879 19:03:58 -- spdkcli/tcp.sh@31 -- # socat_pid=1277115 00:05:39.879 19:03:58 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:39.879 19:03:58 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:40.139 [ 00:05:40.139 "spdk_get_version", 00:05:40.139 "rpc_get_methods", 00:05:40.139 "trace_get_info", 00:05:40.139 "trace_get_tpoint_group_mask", 00:05:40.139 "trace_disable_tpoint_group", 00:05:40.139 "trace_enable_tpoint_group", 00:05:40.139 "trace_clear_tpoint_mask", 00:05:40.139 "trace_set_tpoint_mask", 00:05:40.139 "vfu_tgt_set_base_path", 00:05:40.139 "framework_get_pci_devices", 00:05:40.139 "framework_get_config", 00:05:40.139 "framework_get_subsystems", 00:05:40.139 "iobuf_get_stats", 00:05:40.139 "iobuf_set_options", 00:05:40.139 "sock_set_default_impl", 00:05:40.139 "sock_impl_set_options", 00:05:40.139 "sock_impl_get_options", 00:05:40.139 "vmd_rescan", 00:05:40.139 "vmd_remove_device", 00:05:40.139 "vmd_enable", 00:05:40.139 "accel_get_stats", 00:05:40.139 "accel_set_options", 00:05:40.139 "accel_set_driver", 00:05:40.139 "accel_crypto_key_destroy", 00:05:40.139 "accel_crypto_keys_get", 00:05:40.139 "accel_crypto_key_create", 00:05:40.139 "accel_assign_opc", 00:05:40.139 "accel_get_module_info", 00:05:40.139 "accel_get_opc_assignments", 00:05:40.139 "notify_get_notifications", 00:05:40.139 "notify_get_types", 00:05:40.139 "bdev_get_histogram", 00:05:40.139 "bdev_enable_histogram", 00:05:40.139 "bdev_set_qos_limit", 00:05:40.139 "bdev_set_qd_sampling_period", 00:05:40.139 "bdev_get_bdevs", 00:05:40.139 "bdev_reset_iostat", 00:05:40.139 "bdev_get_iostat", 00:05:40.139 "bdev_examine", 00:05:40.139 "bdev_wait_for_examine", 00:05:40.139 "bdev_set_options", 00:05:40.139 "scsi_get_devices", 00:05:40.139 "thread_set_cpumask", 00:05:40.139 "framework_get_scheduler", 00:05:40.139 "framework_set_scheduler", 00:05:40.139 "framework_get_reactors", 00:05:40.139 "thread_get_io_channels", 00:05:40.139 "thread_get_pollers", 00:05:40.139 "thread_get_stats", 00:05:40.139 "framework_monitor_context_switch", 00:05:40.139 "spdk_kill_instance", 00:05:40.139 "log_enable_timestamps", 00:05:40.139 "log_get_flags", 00:05:40.139 "log_clear_flag", 00:05:40.139 "log_set_flag", 00:05:40.139 "log_get_level", 00:05:40.139 "log_set_level", 00:05:40.139 "log_get_print_level", 00:05:40.139 "log_set_print_level", 00:05:40.139 "framework_enable_cpumask_locks", 00:05:40.139 "framework_disable_cpumask_locks", 00:05:40.139 "framework_wait_init", 00:05:40.139 
"framework_start_init", 00:05:40.139 "virtio_blk_create_transport", 00:05:40.139 "virtio_blk_get_transports", 00:05:40.139 "vhost_controller_set_coalescing", 00:05:40.139 "vhost_get_controllers", 00:05:40.139 "vhost_delete_controller", 00:05:40.139 "vhost_create_blk_controller", 00:05:40.139 "vhost_scsi_controller_remove_target", 00:05:40.139 "vhost_scsi_controller_add_target", 00:05:40.139 "vhost_start_scsi_controller", 00:05:40.139 "vhost_create_scsi_controller", 00:05:40.139 "ublk_recover_disk", 00:05:40.139 "ublk_get_disks", 00:05:40.139 "ublk_stop_disk", 00:05:40.139 "ublk_start_disk", 00:05:40.139 "ublk_destroy_target", 00:05:40.139 "ublk_create_target", 00:05:40.139 "nbd_get_disks", 00:05:40.139 "nbd_stop_disk", 00:05:40.139 "nbd_start_disk", 00:05:40.139 "env_dpdk_get_mem_stats", 00:05:40.139 "nvmf_subsystem_get_listeners", 00:05:40.139 "nvmf_subsystem_get_qpairs", 00:05:40.139 "nvmf_subsystem_get_controllers", 00:05:40.139 "nvmf_get_stats", 00:05:40.139 "nvmf_get_transports", 00:05:40.139 "nvmf_create_transport", 00:05:40.139 "nvmf_get_targets", 00:05:40.139 "nvmf_delete_target", 00:05:40.139 "nvmf_create_target", 00:05:40.139 "nvmf_subsystem_allow_any_host", 00:05:40.139 "nvmf_subsystem_remove_host", 00:05:40.139 "nvmf_subsystem_add_host", 00:05:40.139 "nvmf_subsystem_remove_ns", 00:05:40.139 "nvmf_subsystem_add_ns", 00:05:40.139 "nvmf_subsystem_listener_set_ana_state", 00:05:40.139 "nvmf_discovery_get_referrals", 00:05:40.139 "nvmf_discovery_remove_referral", 00:05:40.139 "nvmf_discovery_add_referral", 00:05:40.139 "nvmf_subsystem_remove_listener", 00:05:40.140 "nvmf_subsystem_add_listener", 00:05:40.140 "nvmf_delete_subsystem", 00:05:40.140 "nvmf_create_subsystem", 00:05:40.140 "nvmf_get_subsystems", 00:05:40.140 "nvmf_set_crdt", 00:05:40.140 "nvmf_set_config", 00:05:40.140 "nvmf_set_max_subsystems", 00:05:40.140 "iscsi_set_options", 00:05:40.140 "iscsi_get_auth_groups", 00:05:40.140 "iscsi_auth_group_remove_secret", 00:05:40.140 "iscsi_auth_group_add_secret", 00:05:40.140 "iscsi_delete_auth_group", 00:05:40.140 "iscsi_create_auth_group", 00:05:40.140 "iscsi_set_discovery_auth", 00:05:40.140 "iscsi_get_options", 00:05:40.140 "iscsi_target_node_request_logout", 00:05:40.140 "iscsi_target_node_set_redirect", 00:05:40.140 "iscsi_target_node_set_auth", 00:05:40.140 "iscsi_target_node_add_lun", 00:05:40.140 "iscsi_get_connections", 00:05:40.140 "iscsi_portal_group_set_auth", 00:05:40.140 "iscsi_start_portal_group", 00:05:40.140 "iscsi_delete_portal_group", 00:05:40.140 "iscsi_create_portal_group", 00:05:40.140 "iscsi_get_portal_groups", 00:05:40.140 "iscsi_delete_target_node", 00:05:40.140 "iscsi_target_node_remove_pg_ig_maps", 00:05:40.140 "iscsi_target_node_add_pg_ig_maps", 00:05:40.140 "iscsi_create_target_node", 00:05:40.140 "iscsi_get_target_nodes", 00:05:40.140 "iscsi_delete_initiator_group", 00:05:40.140 "iscsi_initiator_group_remove_initiators", 00:05:40.140 "iscsi_initiator_group_add_initiators", 00:05:40.140 "iscsi_create_initiator_group", 00:05:40.140 "iscsi_get_initiator_groups", 00:05:40.140 "vfu_virtio_create_scsi_endpoint", 00:05:40.140 "vfu_virtio_scsi_remove_target", 00:05:40.140 "vfu_virtio_scsi_add_target", 00:05:40.140 "vfu_virtio_create_blk_endpoint", 00:05:40.140 "vfu_virtio_delete_endpoint", 00:05:40.140 "iaa_scan_accel_module", 00:05:40.140 "dsa_scan_accel_module", 00:05:40.140 "ioat_scan_accel_module", 00:05:40.140 "accel_error_inject_error", 00:05:40.140 "bdev_iscsi_delete", 00:05:40.140 "bdev_iscsi_create", 00:05:40.140 "bdev_iscsi_set_options", 
00:05:40.140 "bdev_virtio_attach_controller", 00:05:40.140 "bdev_virtio_scsi_get_devices", 00:05:40.140 "bdev_virtio_detach_controller", 00:05:40.140 "bdev_virtio_blk_set_hotplug", 00:05:40.140 "bdev_ftl_set_property", 00:05:40.140 "bdev_ftl_get_properties", 00:05:40.140 "bdev_ftl_get_stats", 00:05:40.140 "bdev_ftl_unmap", 00:05:40.140 "bdev_ftl_unload", 00:05:40.140 "bdev_ftl_delete", 00:05:40.140 "bdev_ftl_load", 00:05:40.140 "bdev_ftl_create", 00:05:40.140 "bdev_aio_delete", 00:05:40.140 "bdev_aio_rescan", 00:05:40.140 "bdev_aio_create", 00:05:40.140 "blobfs_create", 00:05:40.140 "blobfs_detect", 00:05:40.140 "blobfs_set_cache_size", 00:05:40.140 "bdev_zone_block_delete", 00:05:40.140 "bdev_zone_block_create", 00:05:40.140 "bdev_delay_delete", 00:05:40.140 "bdev_delay_create", 00:05:40.140 "bdev_delay_update_latency", 00:05:40.140 "bdev_split_delete", 00:05:40.140 "bdev_split_create", 00:05:40.140 "bdev_error_inject_error", 00:05:40.140 "bdev_error_delete", 00:05:40.140 "bdev_error_create", 00:05:40.140 "bdev_raid_set_options", 00:05:40.140 "bdev_raid_remove_base_bdev", 00:05:40.140 "bdev_raid_add_base_bdev", 00:05:40.140 "bdev_raid_delete", 00:05:40.140 "bdev_raid_create", 00:05:40.140 "bdev_raid_get_bdevs", 00:05:40.140 "bdev_lvol_grow_lvstore", 00:05:40.140 "bdev_lvol_get_lvols", 00:05:40.140 "bdev_lvol_get_lvstores", 00:05:40.140 "bdev_lvol_delete", 00:05:40.140 "bdev_lvol_set_read_only", 00:05:40.140 "bdev_lvol_resize", 00:05:40.140 "bdev_lvol_decouple_parent", 00:05:40.140 "bdev_lvol_inflate", 00:05:40.140 "bdev_lvol_rename", 00:05:40.140 "bdev_lvol_clone_bdev", 00:05:40.140 "bdev_lvol_clone", 00:05:40.140 "bdev_lvol_snapshot", 00:05:40.140 "bdev_lvol_create", 00:05:40.140 "bdev_lvol_delete_lvstore", 00:05:40.140 "bdev_lvol_rename_lvstore", 00:05:40.140 "bdev_lvol_create_lvstore", 00:05:40.140 "bdev_passthru_delete", 00:05:40.140 "bdev_passthru_create", 00:05:40.140 "bdev_nvme_cuse_unregister", 00:05:40.140 "bdev_nvme_cuse_register", 00:05:40.140 "bdev_opal_new_user", 00:05:40.140 "bdev_opal_set_lock_state", 00:05:40.140 "bdev_opal_delete", 00:05:40.140 "bdev_opal_get_info", 00:05:40.140 "bdev_opal_create", 00:05:40.140 "bdev_nvme_opal_revert", 00:05:40.140 "bdev_nvme_opal_init", 00:05:40.140 "bdev_nvme_send_cmd", 00:05:40.140 "bdev_nvme_get_path_iostat", 00:05:40.140 "bdev_nvme_get_mdns_discovery_info", 00:05:40.140 "bdev_nvme_stop_mdns_discovery", 00:05:40.140 "bdev_nvme_start_mdns_discovery", 00:05:40.140 "bdev_nvme_set_multipath_policy", 00:05:40.140 "bdev_nvme_set_preferred_path", 00:05:40.140 "bdev_nvme_get_io_paths", 00:05:40.140 "bdev_nvme_remove_error_injection", 00:05:40.140 "bdev_nvme_add_error_injection", 00:05:40.140 "bdev_nvme_get_discovery_info", 00:05:40.140 "bdev_nvme_stop_discovery", 00:05:40.140 "bdev_nvme_start_discovery", 00:05:40.140 "bdev_nvme_get_controller_health_info", 00:05:40.140 "bdev_nvme_disable_controller", 00:05:40.140 "bdev_nvme_enable_controller", 00:05:40.140 "bdev_nvme_reset_controller", 00:05:40.140 "bdev_nvme_get_transport_statistics", 00:05:40.140 "bdev_nvme_apply_firmware", 00:05:40.140 "bdev_nvme_detach_controller", 00:05:40.140 "bdev_nvme_get_controllers", 00:05:40.140 "bdev_nvme_attach_controller", 00:05:40.140 "bdev_nvme_set_hotplug", 00:05:40.140 "bdev_nvme_set_options", 00:05:40.140 "bdev_null_resize", 00:05:40.140 "bdev_null_delete", 00:05:40.140 "bdev_null_create", 00:05:40.140 "bdev_malloc_delete", 00:05:40.140 "bdev_malloc_create" 00:05:40.140 ] 00:05:40.140 19:03:58 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:40.140 19:03:58 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:40.140 19:03:58 -- common/autotest_common.sh@10 -- # set +x 00:05:40.140 19:03:58 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:40.140 19:03:58 -- spdkcli/tcp.sh@38 -- # killprocess 1276926 00:05:40.140 19:03:58 -- common/autotest_common.sh@936 -- # '[' -z 1276926 ']' 00:05:40.140 19:03:58 -- common/autotest_common.sh@940 -- # kill -0 1276926 00:05:40.140 19:03:58 -- common/autotest_common.sh@941 -- # uname 00:05:40.140 19:03:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:40.140 19:03:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1276926 00:05:40.400 19:03:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:40.400 19:03:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:40.400 19:03:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1276926' 00:05:40.400 killing process with pid 1276926 00:05:40.400 19:03:58 -- common/autotest_common.sh@955 -- # kill 1276926 00:05:40.400 19:03:58 -- common/autotest_common.sh@960 -- # wait 1276926 00:05:40.659 00:05:40.659 real 0m1.636s 00:05:40.659 user 0m2.965s 00:05:40.659 sys 0m0.495s 00:05:40.659 19:03:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.659 19:03:59 -- common/autotest_common.sh@10 -- # set +x 00:05:40.659 ************************************ 00:05:40.659 END TEST spdkcli_tcp 00:05:40.659 ************************************ 00:05:40.659 19:03:59 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:40.659 19:03:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.659 19:03:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.659 19:03:59 -- common/autotest_common.sh@10 -- # set +x 00:05:40.659 ************************************ 00:05:40.659 START TEST dpdk_mem_utility 00:05:40.659 ************************************ 00:05:40.659 19:03:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:40.659 * Looking for test storage... 
00:05:40.659 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:40.659 19:03:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.659 19:03:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.659 19:03:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.918 19:03:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.918 19:03:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.918 19:03:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.918 19:03:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.918 19:03:59 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.918 19:03:59 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.918 19:03:59 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.918 19:03:59 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.918 19:03:59 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.918 19:03:59 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.918 19:03:59 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.918 19:03:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.918 19:03:59 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.918 19:03:59 -- scripts/common.sh@344 -- # : 1 00:05:40.918 19:03:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.918 19:03:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:40.918 19:03:59 -- scripts/common.sh@364 -- # decimal 1 00:05:40.918 19:03:59 -- scripts/common.sh@352 -- # local d=1 00:05:40.918 19:03:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.918 19:03:59 -- scripts/common.sh@354 -- # echo 1 00:05:40.918 19:03:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.918 19:03:59 -- scripts/common.sh@365 -- # decimal 2 00:05:40.918 19:03:59 -- scripts/common.sh@352 -- # local d=2 00:05:40.918 19:03:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.918 19:03:59 -- scripts/common.sh@354 -- # echo 2 00:05:40.918 19:03:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.918 19:03:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.918 19:03:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.918 19:03:59 -- scripts/common.sh@367 -- # return 0 00:05:40.918 19:03:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.918 19:03:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.918 --rc genhtml_branch_coverage=1 00:05:40.918 --rc genhtml_function_coverage=1 00:05:40.918 --rc genhtml_legend=1 00:05:40.918 --rc geninfo_all_blocks=1 00:05:40.918 --rc geninfo_unexecuted_blocks=1 00:05:40.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.918 ' 00:05:40.918 19:03:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.918 --rc genhtml_branch_coverage=1 00:05:40.918 --rc genhtml_function_coverage=1 00:05:40.918 --rc genhtml_legend=1 00:05:40.918 --rc geninfo_all_blocks=1 00:05:40.918 --rc geninfo_unexecuted_blocks=1 00:05:40.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.918 ' 00:05:40.918 19:03:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.918 --rc 
genhtml_branch_coverage=1 00:05:40.918 --rc genhtml_function_coverage=1 00:05:40.918 --rc genhtml_legend=1 00:05:40.918 --rc geninfo_all_blocks=1 00:05:40.918 --rc geninfo_unexecuted_blocks=1 00:05:40.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.918 ' 00:05:40.918 19:03:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.918 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.918 --rc genhtml_branch_coverage=1 00:05:40.918 --rc genhtml_function_coverage=1 00:05:40.918 --rc genhtml_legend=1 00:05:40.918 --rc geninfo_all_blocks=1 00:05:40.918 --rc geninfo_unexecuted_blocks=1 00:05:40.918 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.918 ' 00:05:40.918 19:03:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:40.918 19:03:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1277273 00:05:40.918 19:03:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1277273 00:05:40.918 19:03:59 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:40.918 19:03:59 -- common/autotest_common.sh@829 -- # '[' -z 1277273 ']' 00:05:40.918 19:03:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:40.918 19:03:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:40.918 19:03:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:40.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:40.918 19:03:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:40.918 19:03:59 -- common/autotest_common.sh@10 -- # set +x 00:05:40.918 [2024-11-18 19:03:59.307309] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
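The dpdk_mem_utility test that follows asks the freshly started target to dump its DPDK memory state over RPC and then post-processes the dump with dpdk_mem_info.py. Condensed from the trace (paths and flags as logged):

    scripts/rpc.py env_dpdk_get_mem_stats     # writes /tmp/spdk_mem_dump.txt
    scripts/dpdk_mem_info.py                  # heap/mempool/memzone totals
    scripts/dpdk_mem_info.py -m 0             # element-level dump for heap id 0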
00:05:40.918 [2024-11-18 19:03:59.307381] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1277273 ] 00:05:40.918 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.918 [2024-11-18 19:03:59.374019] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.918 [2024-11-18 19:03:59.443118] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:40.918 [2024-11-18 19:03:59.443225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.857 19:04:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:41.857 19:04:00 -- common/autotest_common.sh@862 -- # return 0 00:05:41.857 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:41.857 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:41.857 19:04:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:41.857 19:04:00 -- common/autotest_common.sh@10 -- # set +x 00:05:41.857 { 00:05:41.857 "filename": "/tmp/spdk_mem_dump.txt" 00:05:41.857 } 00:05:41.857 19:04:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:41.857 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:41.857 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:41.857 1 heaps totaling size 814.000000 MiB 00:05:41.857 size: 814.000000 MiB heap id: 0 00:05:41.857 end heaps---------- 00:05:41.857 8 mempools totaling size 598.116089 MiB 00:05:41.857 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:41.857 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:41.857 size: 84.521057 MiB name: bdev_io_1277273 00:05:41.857 size: 51.011292 MiB name: evtpool_1277273 00:05:41.857 size: 50.003479 MiB name: msgpool_1277273 00:05:41.857 size: 21.763794 MiB name: PDU_Pool 00:05:41.857 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:41.857 size: 0.026123 MiB name: Session_Pool 00:05:41.857 end mempools------- 00:05:41.857 6 memzones totaling size 4.142822 MiB 00:05:41.857 size: 1.000366 MiB name: RG_ring_0_1277273 00:05:41.857 size: 1.000366 MiB name: RG_ring_1_1277273 00:05:41.857 size: 1.000366 MiB name: RG_ring_4_1277273 00:05:41.857 size: 1.000366 MiB name: RG_ring_5_1277273 00:05:41.857 size: 0.125366 MiB name: RG_ring_2_1277273 00:05:41.857 size: 0.015991 MiB name: RG_ring_3_1277273 00:05:41.857 end memzones------- 00:05:41.857 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:41.857 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:41.857 list of free elements. 
size: 12.519348 MiB 00:05:41.857 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:41.857 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:41.857 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:41.857 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:41.857 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:41.857 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:41.857 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:41.857 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:41.857 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:41.857 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:41.857 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:41.857 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:41.857 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:41.857 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:41.857 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:41.857 list of standard malloc elements. size: 199.218079 MiB 00:05:41.857 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:41.857 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:41.857 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:41.857 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:41.857 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:41.857 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:41.857 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:41.857 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:41.857 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:41.857 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:41.857 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:41.857 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:41.857 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:41.857 list of memzone associated elements. size: 602.262573 MiB 00:05:41.857 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:41.857 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:41.858 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:41.858 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:41.858 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:41.858 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1277273_0 00:05:41.858 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:41.858 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1277273_0 00:05:41.858 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:41.858 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1277273_0 00:05:41.858 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:41.858 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:41.858 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:41.858 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:41.858 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:41.858 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1277273 00:05:41.858 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:41.858 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1277273 00:05:41.858 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:41.858 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1277273 00:05:41.858 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:41.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:41.858 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:41.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:41.858 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:41.858 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:41.858 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:41.858 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:41.858 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:41.858 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1277273 00:05:41.858 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:41.858 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1277273 00:05:41.858 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:41.858 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1277273 00:05:41.858 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:41.858 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1277273 00:05:41.858 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:41.858 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1277273 00:05:41.858 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:41.858 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:41.858 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:41.858 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:41.858 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:41.858 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:41.858 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:41.858 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1277273 00:05:41.858 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:41.858 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:41.858 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:41.858 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:41.858 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:41.858 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1277273 00:05:41.858 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:41.858 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:41.858 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:41.858 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1277273 00:05:41.858 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:41.858 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1277273 00:05:41.858 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:41.858 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:41.858 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:41.858 19:04:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1277273 00:05:41.858 19:04:00 -- common/autotest_common.sh@936 -- # '[' -z 1277273 ']' 00:05:41.858 19:04:00 -- common/autotest_common.sh@940 -- # kill -0 1277273 00:05:41.858 19:04:00 -- common/autotest_common.sh@941 -- # uname 00:05:41.858 19:04:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:41.858 19:04:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1277273 00:05:41.858 19:04:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:41.858 19:04:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:41.858 19:04:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1277273' 00:05:41.858 killing process with pid 1277273 00:05:41.858 19:04:00 -- common/autotest_common.sh@955 -- # kill 1277273 00:05:41.858 19:04:00 -- common/autotest_common.sh@960 -- # wait 1277273 00:05:42.117 00:05:42.117 real 0m1.527s 00:05:42.117 user 0m1.583s 00:05:42.117 sys 0m0.458s 00:05:42.117 19:04:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.117 19:04:00 -- common/autotest_common.sh@10 -- # set +x 00:05:42.117 ************************************ 00:05:42.117 END TEST dpdk_mem_utility 00:05:42.117 ************************************ 00:05:42.117 19:04:00 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:42.117 19:04:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.117 19:04:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.117 19:04:00 -- common/autotest_common.sh@10 -- # set +x 
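(Note: the memory report above comes from SPDK's dpdk_mem_utility test, test_dpdk_mem_info.sh, which walks a running target's DPDK heap and memzone tables before killing the app. A rough way to pull a similar report from a live target by hand, assuming stock build paths and SPDK's env_dpdk_get_mem_stats RPC, both of which are assumptions not taken from this log:

    # Start a target, then ask it to dump DPDK memory stats over its RPC socket;
    # the RPC response reports where the dump files were written.
    ./build/bin/spdk_tgt &
    ./scripts/rpc.py env_dpdk_get_mem_stats
)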
00:05:42.117 ************************************ 00:05:42.117 START TEST event 00:05:42.117 ************************************ 00:05:42.117 19:04:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:42.377 * Looking for test storage... 00:05:42.377 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:42.377 19:04:00 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.377 19:04:00 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.377 19:04:00 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.377 19:04:00 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.377 19:04:00 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.377 19:04:00 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.377 19:04:00 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.377 19:04:00 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.377 19:04:00 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.377 19:04:00 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.377 19:04:00 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.377 19:04:00 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.377 19:04:00 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.377 19:04:00 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.377 19:04:00 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.377 19:04:00 -- scripts/common.sh@344 -- # : 1 00:05:42.377 19:04:00 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.377 19:04:00 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:42.377 19:04:00 -- scripts/common.sh@364 -- # decimal 1 00:05:42.377 19:04:00 -- scripts/common.sh@352 -- # local d=1 00:05:42.377 19:04:00 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.377 19:04:00 -- scripts/common.sh@354 -- # echo 1 00:05:42.377 19:04:00 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.377 19:04:00 -- scripts/common.sh@365 -- # decimal 2 00:05:42.377 19:04:00 -- scripts/common.sh@352 -- # local d=2 00:05:42.377 19:04:00 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.377 19:04:00 -- scripts/common.sh@354 -- # echo 2 00:05:42.377 19:04:00 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.377 19:04:00 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.377 19:04:00 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.377 19:04:00 -- scripts/common.sh@367 -- # return 0 00:05:42.377 19:04:00 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.377 --rc genhtml_branch_coverage=1 00:05:42.377 --rc genhtml_function_coverage=1 00:05:42.377 --rc genhtml_legend=1 00:05:42.377 --rc geninfo_all_blocks=1 00:05:42.377 --rc geninfo_unexecuted_blocks=1 00:05:42.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.377 ' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.377 --rc genhtml_branch_coverage=1 00:05:42.377 --rc genhtml_function_coverage=1 00:05:42.377 --rc genhtml_legend=1 00:05:42.377 --rc geninfo_all_blocks=1 00:05:42.377 --rc geninfo_unexecuted_blocks=1 00:05:42.377 
--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.377 ' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.377 --rc genhtml_branch_coverage=1 00:05:42.377 --rc genhtml_function_coverage=1 00:05:42.377 --rc genhtml_legend=1 00:05:42.377 --rc geninfo_all_blocks=1 00:05:42.377 --rc geninfo_unexecuted_blocks=1 00:05:42.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.377 ' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.377 --rc genhtml_branch_coverage=1 00:05:42.377 --rc genhtml_function_coverage=1 00:05:42.377 --rc genhtml_legend=1 00:05:42.377 --rc geninfo_all_blocks=1 00:05:42.377 --rc geninfo_unexecuted_blocks=1 00:05:42.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.377 ' 00:05:42.377 19:04:00 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:42.377 19:04:00 -- bdev/nbd_common.sh@6 -- # set -e 00:05:42.377 19:04:00 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:42.377 19:04:00 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:42.377 19:04:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.377 19:04:00 -- common/autotest_common.sh@10 -- # set +x 00:05:42.377 ************************************ 00:05:42.377 START TEST event_perf 00:05:42.377 ************************************ 00:05:42.377 19:04:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:42.377 Running I/O for 1 seconds...[2024-11-18 19:04:00.875789] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:42.377 [2024-11-18 19:04:00.875875] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1277607 ] 00:05:42.377 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.377 [2024-11-18 19:04:00.946343] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:42.636 [2024-11-18 19:04:01.018325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:42.636 [2024-11-18 19:04:01.018421] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:42.636 [2024-11-18 19:04:01.018508] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:42.636 [2024-11-18 19:04:01.018510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.573 Running I/O for 1 seconds... 00:05:43.573 lcore 0: 190487 00:05:43.573 lcore 1: 190485 00:05:43.573 lcore 2: 190486 00:05:43.573 lcore 3: 190486 00:05:43.573 done. 
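(Each reactor above processed roughly 190k events in its one-second window, which is what event_perf measures: core mask 0xF pins four reactors and -t 1 bounds the run. The invocation is verbatim in the trace, so a standalone rerun is simply:

    # Same benchmark the harness ran: four reactors (core mask 0xF), 1-second window.
    ./test/event/event_perf/event_perf -m 0xF -t 1
)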
00:05:43.573 00:05:43.573 real 0m1.225s 00:05:43.573 user 0m4.132s 00:05:43.573 sys 0m0.090s 00:05:43.573 19:04:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:43.573 19:04:02 -- common/autotest_common.sh@10 -- # set +x 00:05:43.573 ************************************ 00:05:43.573 END TEST event_perf 00:05:43.573 ************************************ 00:05:43.573 19:04:02 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:43.573 19:04:02 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:43.573 19:04:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.573 19:04:02 -- common/autotest_common.sh@10 -- # set +x 00:05:43.573 ************************************ 00:05:43.573 START TEST event_reactor 00:05:43.573 ************************************ 00:05:43.573 19:04:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:43.573 [2024-11-18 19:04:02.149137] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:43.573 [2024-11-18 19:04:02.149233] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1277900 ] 00:05:43.833 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.833 [2024-11-18 19:04:02.219814] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.833 [2024-11-18 19:04:02.286787] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.771 test_start 00:05:44.771 oneshot 00:05:44.771 tick 100 00:05:44.771 tick 100 00:05:44.771 tick 250 00:05:44.771 tick 100 00:05:44.771 tick 100 00:05:44.771 tick 100 00:05:44.771 tick 250 00:05:44.771 tick 500 00:05:44.771 tick 100 00:05:44.771 tick 100 00:05:44.771 tick 250 00:05:44.771 tick 100 00:05:44.771 tick 100 00:05:44.771 test_end 00:05:44.771 00:05:44.771 real 0m1.218s 00:05:44.771 user 0m1.138s 00:05:44.771 sys 0m0.076s 00:05:44.771 19:04:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:44.771 19:04:03 -- common/autotest_common.sh@10 -- # set +x 00:05:44.771 ************************************ 00:05:44.771 END TEST event_reactor 00:05:44.771 ************************************ 00:05:45.031 19:04:03 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:45.031 19:04:03 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:45.031 19:04:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.031 19:04:03 -- common/autotest_common.sh@10 -- # set +x 00:05:45.031 ************************************ 00:05:45.031 START TEST event_reactor_perf 00:05:45.031 ************************************ 00:05:45.031 19:04:03 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:45.031 [2024-11-18 19:04:03.409560] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:45.031 [2024-11-18 19:04:03.409670] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1278182 ] 00:05:45.031 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.031 [2024-11-18 19:04:03.479896] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.031 [2024-11-18 19:04:03.546594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.409 test_start 00:05:46.409 test_end 00:05:46.409 Performance: 968729 events per second 00:05:46.409 00:05:46.409 real 0m1.218s 00:05:46.409 user 0m1.131s 00:05:46.409 sys 0m0.083s 00:05:46.409 19:04:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.409 19:04:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.409 ************************************ 00:05:46.409 END TEST event_reactor_perf 00:05:46.409 ************************************ 00:05:46.409 19:04:04 -- event/event.sh@49 -- # uname -s 00:05:46.409 19:04:04 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:46.409 19:04:04 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:46.409 19:04:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.409 19:04:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.409 19:04:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.409 ************************************ 00:05:46.409 START TEST event_scheduler 00:05:46.409 ************************************ 00:05:46.409 19:04:04 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:46.409 * Looking for test storage... 00:05:46.409 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:46.409 19:04:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:46.409 19:04:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:46.409 19:04:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:46.409 19:04:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:46.409 19:04:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:46.409 19:04:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:46.409 19:04:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:46.409 19:04:04 -- scripts/common.sh@335 -- # IFS=.-: 00:05:46.409 19:04:04 -- scripts/common.sh@335 -- # read -ra ver1 00:05:46.409 19:04:04 -- scripts/common.sh@336 -- # IFS=.-: 00:05:46.410 19:04:04 -- scripts/common.sh@336 -- # read -ra ver2 00:05:46.410 19:04:04 -- scripts/common.sh@337 -- # local 'op=<' 00:05:46.410 19:04:04 -- scripts/common.sh@339 -- # ver1_l=2 00:05:46.410 19:04:04 -- scripts/common.sh@340 -- # ver2_l=1 00:05:46.410 19:04:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:46.410 19:04:04 -- scripts/common.sh@343 -- # case "$op" in 00:05:46.410 19:04:04 -- scripts/common.sh@344 -- # : 1 00:05:46.410 19:04:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:46.410 19:04:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:46.410 19:04:04 -- scripts/common.sh@364 -- # decimal 1 00:05:46.410 19:04:04 -- scripts/common.sh@352 -- # local d=1 00:05:46.410 19:04:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:46.410 19:04:04 -- scripts/common.sh@354 -- # echo 1 00:05:46.410 19:04:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:46.410 19:04:04 -- scripts/common.sh@365 -- # decimal 2 00:05:46.410 19:04:04 -- scripts/common.sh@352 -- # local d=2 00:05:46.410 19:04:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:46.410 19:04:04 -- scripts/common.sh@354 -- # echo 2 00:05:46.410 19:04:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:46.410 19:04:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:46.410 19:04:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:46.410 19:04:04 -- scripts/common.sh@367 -- # return 0 00:05:46.410 19:04:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:46.410 19:04:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:46.410 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.410 --rc genhtml_branch_coverage=1 00:05:46.410 --rc genhtml_function_coverage=1 00:05:46.410 --rc genhtml_legend=1 00:05:46.410 --rc geninfo_all_blocks=1 00:05:46.410 --rc geninfo_unexecuted_blocks=1 00:05:46.410 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.410 ' 00:05:46.410 19:04:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:46.410 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.410 --rc genhtml_branch_coverage=1 00:05:46.410 --rc genhtml_function_coverage=1 00:05:46.410 --rc genhtml_legend=1 00:05:46.410 --rc geninfo_all_blocks=1 00:05:46.410 --rc geninfo_unexecuted_blocks=1 00:05:46.410 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.410 ' 00:05:46.410 19:04:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:46.410 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.410 --rc genhtml_branch_coverage=1 00:05:46.410 --rc genhtml_function_coverage=1 00:05:46.410 --rc genhtml_legend=1 00:05:46.410 --rc geninfo_all_blocks=1 00:05:46.410 --rc geninfo_unexecuted_blocks=1 00:05:46.410 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.410 ' 00:05:46.410 19:04:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:46.410 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.410 --rc genhtml_branch_coverage=1 00:05:46.410 --rc genhtml_function_coverage=1 00:05:46.410 --rc genhtml_legend=1 00:05:46.410 --rc geninfo_all_blocks=1 00:05:46.410 --rc geninfo_unexecuted_blocks=1 00:05:46.410 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.410 ' 00:05:46.410 19:04:04 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:46.410 19:04:04 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1278500 00:05:46.410 19:04:04 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:46.410 19:04:04 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:46.410 19:04:04 -- scheduler/scheduler.sh@37 -- # waitforlisten 1278500 00:05:46.410 19:04:04 -- common/autotest_common.sh@829 -- # '[' -z 1278500 ']' 00:05:46.410 19:04:04 -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.410 19:04:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.410 19:04:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.410 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:46.410 19:04:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.410 19:04:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.410 [2024-11-18 19:04:04.875131] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:46.410 [2024-11-18 19:04:04.875213] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1278500 ] 00:05:46.410 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.410 [2024-11-18 19:04:04.940727] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:46.669 [2024-11-18 19:04:05.013021] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.669 [2024-11-18 19:04:05.013106] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:46.669 [2024-11-18 19:04:05.013212] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:46.669 [2024-11-18 19:04:05.013214] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:47.237 19:04:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.237 19:04:05 -- common/autotest_common.sh@862 -- # return 0 00:05:47.237 19:04:05 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:47.237 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.237 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.237 POWER: Env isn't set yet! 00:05:47.237 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:47.237 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:47.237 POWER: Cannot set governor of lcore 0 to userspace 00:05:47.237 POWER: Attempting to initialise PSTAT power management... 
00:05:47.237 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:47.237 POWER: Initialized successfully for lcore 0 power management 00:05:47.237 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:47.237 POWER: Initialized successfully for lcore 1 power management 00:05:47.237 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:47.237 POWER: Initialized successfully for lcore 2 power management 00:05:47.237 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:47.237 POWER: Initialized successfully for lcore 3 power management 00:05:47.237 [2024-11-18 19:04:05.753762] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:47.238 [2024-11-18 19:04:05.753778] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:47.238 [2024-11-18 19:04:05.753789] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:47.238 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.238 19:04:05 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:47.238 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.238 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.238 [2024-11-18 19:04:05.820673] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:47.238 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.238 19:04:05 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:47.238 19:04:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.238 19:04:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.238 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.238 ************************************ 00:05:47.238 START TEST scheduler_create_thread 00:05:47.238 ************************************ 00:05:47.238 19:04:05 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:05:47.238 19:04:05 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:47.238 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.238 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 2 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 3 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 4 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 5 00:05:47.497 
19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 6 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 7 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 8 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 9 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 10 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.497 19:04:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:47.497 19:04:05 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:47.497 19:04:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.497 19:04:05 -- common/autotest_common.sh@10 -- # set +x 00:05:48.435 19:04:06 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:48.435 19:04:06 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:48.435 19:04:06 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:48.435 19:04:06 -- common/autotest_common.sh@10 -- # set +x 00:05:49.814 19:04:08 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:49.814 19:04:08 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:49.814 19:04:08 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:49.814 19:04:08 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:49.814 19:04:08 -- common/autotest_common.sh@10 -- # set +x 00:05:50.753 19:04:09 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:50.753 00:05:50.753 real 0m3.382s 00:05:50.753 user 0m0.023s 00:05:50.753 sys 0m0.007s 00:05:50.753 19:04:09 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.753 19:04:09 -- common/autotest_common.sh@10 -- # set +x 00:05:50.753 ************************************ 00:05:50.753 END TEST scheduler_create_thread 00:05:50.753 ************************************ 00:05:50.753 19:04:09 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:50.753 19:04:09 -- scheduler/scheduler.sh@46 -- # killprocess 1278500 00:05:50.753 19:04:09 -- common/autotest_common.sh@936 -- # '[' -z 1278500 ']' 00:05:50.753 19:04:09 -- common/autotest_common.sh@940 -- # kill -0 1278500 00:05:50.753 19:04:09 -- common/autotest_common.sh@941 -- # uname 00:05:50.753 19:04:09 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:50.753 19:04:09 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1278500 00:05:50.753 19:04:09 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:50.753 19:04:09 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:50.753 19:04:09 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1278500' 00:05:50.753 killing process with pid 1278500 00:05:50.754 19:04:09 -- common/autotest_common.sh@955 -- # kill 1278500 00:05:50.754 19:04:09 -- common/autotest_common.sh@960 -- # wait 1278500 00:05:51.014 [2024-11-18 19:04:09.592390] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:05:51.273 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:51.273 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:51.273 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:51.273 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:51.273 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:51.273 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:51.273 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:51.273 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:51.273 00:05:51.273 real 0m5.154s 00:05:51.273 user 0m10.573s 00:05:51.273 sys 0m0.433s 00:05:51.273 19:04:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.273 19:04:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.273 ************************************ 00:05:51.273 END TEST event_scheduler 00:05:51.273 ************************************ 00:05:51.273 19:04:09 -- event/event.sh@51 -- # modprobe -n nbd 00:05:51.273 19:04:09 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:51.273 19:04:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.273 19:04:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.273 19:04:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.273 ************************************ 00:05:51.273 START TEST app_repeat 00:05:51.273 ************************************ 00:05:51.273 19:04:09 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:05:51.273 19:04:09 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.273 19:04:09 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:51.273 19:04:09 -- event/event.sh@13 -- # local nbd_list 00:05:51.532 19:04:09 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:51.532 19:04:09 -- 
event/event.sh@14 -- # local bdev_list 00:05:51.532 19:04:09 -- event/event.sh@15 -- # local repeat_times=4 00:05:51.532 19:04:09 -- event/event.sh@17 -- # modprobe nbd 00:05:51.532 19:04:09 -- event/event.sh@19 -- # repeat_pid=1279367 00:05:51.532 19:04:09 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:51.532 19:04:09 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:51.532 19:04:09 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1279367' 00:05:51.532 Process app_repeat pid: 1279367 00:05:51.532 19:04:09 -- event/event.sh@23 -- # for i in {0..2} 00:05:51.532 19:04:09 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:51.532 spdk_app_start Round 0 00:05:51.532 19:04:09 -- event/event.sh@25 -- # waitforlisten 1279367 /var/tmp/spdk-nbd.sock 00:05:51.532 19:04:09 -- common/autotest_common.sh@829 -- # '[' -z 1279367 ']' 00:05:51.532 19:04:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:51.532 19:04:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:51.532 19:04:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:51.532 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:51.532 19:04:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:51.532 19:04:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.532 [2024-11-18 19:04:09.903678] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:51.532 [2024-11-18 19:04:09.903768] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1279367 ] 00:05:51.532 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.532 [2024-11-18 19:04:09.974266] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:51.532 [2024-11-18 19:04:10.050033] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:51.532 [2024-11-18 19:04:10.050035] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.468 19:04:10 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:52.468 19:04:10 -- common/autotest_common.sh@862 -- # return 0 00:05:52.468 19:04:10 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:52.468 Malloc0 00:05:52.468 19:04:10 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:52.727 Malloc1 00:05:52.727 19:04:11 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:52.727 
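(At this point app_repeat is listening on /var/tmp/spdk-nbd.sock with two reactors (mask 0x3) and two 64 MiB malloc bdevs with 4096-byte blocks have been created; nbd_rpc_data_verify next exports them as /dev/nbd0 and /dev/nbd1 and round-trips random data through each. The dd/cmp traffic that follows reduces to this sketch, with file paths shortened; the real run goes through the nbd_common.sh helpers shown in the trace:

    # Push 1 MiB of random data through the exported NBD device, then compare.
    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    dd if=nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
    cmp -b -n 1M nbdrandtest /dev/nbd0
    rm nbdrandtest
)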
19:04:11 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@12 -- # local i 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:52.727 /dev/nbd0 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:52.727 19:04:11 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:52.727 19:04:11 -- common/autotest_common.sh@867 -- # local i 00:05:52.727 19:04:11 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:52.727 19:04:11 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:52.727 19:04:11 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:52.727 19:04:11 -- common/autotest_common.sh@871 -- # break 00:05:52.727 19:04:11 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:52.727 19:04:11 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:52.727 19:04:11 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:52.727 1+0 records in 00:05:52.727 1+0 records out 00:05:52.727 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227344 s, 18.0 MB/s 00:05:52.727 19:04:11 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:52.727 19:04:11 -- common/autotest_common.sh@884 -- # size=4096 00:05:52.727 19:04:11 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:52.727 19:04:11 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:52.727 19:04:11 -- common/autotest_common.sh@887 -- # return 0 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:52.727 19:04:11 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:52.986 /dev/nbd1 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:52.986 19:04:11 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:52.986 19:04:11 -- common/autotest_common.sh@867 -- # local i 00:05:52.986 19:04:11 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:52.986 19:04:11 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:52.986 19:04:11 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:52.986 19:04:11 -- common/autotest_common.sh@871 -- # break 00:05:52.986 19:04:11 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:52.986 19:04:11 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:52.986 19:04:11 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:05:52.986 1+0 records in 00:05:52.986 1+0 records out 00:05:52.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251853 s, 16.3 MB/s 00:05:52.986 19:04:11 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:52.986 19:04:11 -- common/autotest_common.sh@884 -- # size=4096 00:05:52.986 19:04:11 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:52.986 19:04:11 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:52.986 19:04:11 -- common/autotest_common.sh@887 -- # return 0 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:52.986 19:04:11 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:53.245 { 00:05:53.245 "nbd_device": "/dev/nbd0", 00:05:53.245 "bdev_name": "Malloc0" 00:05:53.245 }, 00:05:53.245 { 00:05:53.245 "nbd_device": "/dev/nbd1", 00:05:53.245 "bdev_name": "Malloc1" 00:05:53.245 } 00:05:53.245 ]' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:53.245 { 00:05:53.245 "nbd_device": "/dev/nbd0", 00:05:53.245 "bdev_name": "Malloc0" 00:05:53.245 }, 00:05:53.245 { 00:05:53.245 "nbd_device": "/dev/nbd1", 00:05:53.245 "bdev_name": "Malloc1" 00:05:53.245 } 00:05:53.245 ]' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:53.245 /dev/nbd1' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:53.245 /dev/nbd1' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@65 -- # count=2 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@95 -- # count=2 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:53.245 256+0 records in 00:05:53.245 256+0 records out 00:05:53.245 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00963179 s, 109 MB/s 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:53.245 256+0 records in 00:05:53.245 256+0 records out 00:05:53.245 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0192934 s, 54.3 MB/s 00:05:53.245 19:04:11 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:53.245 256+0 records in 00:05:53.245 256+0 records out 00:05:53.245 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206072 s, 50.9 MB/s 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@51 -- # local i 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:53.245 19:04:11 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@41 -- # break 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@45 -- # return 0 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:53.505 19:04:12 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@41 -- # break 00:05:53.765 19:04:12 -- 
bdev/nbd_common.sh@45 -- # return 0 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:53.765 19:04:12 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@65 -- # true 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@65 -- # count=0 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@104 -- # count=0 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:54.024 19:04:12 -- bdev/nbd_common.sh@109 -- # return 0 00:05:54.024 19:04:12 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:54.283 19:04:12 -- event/event.sh@35 -- # sleep 3 00:05:54.283 [2024-11-18 19:04:12.846419] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:54.542 [2024-11-18 19:04:12.910327] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:54.542 [2024-11-18 19:04:12.910329] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.542 [2024-11-18 19:04:12.950808] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:54.542 [2024-11-18 19:04:12.950851] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:57.079 19:04:15 -- event/event.sh@23 -- # for i in {0..2} 00:05:57.079 19:04:15 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:57.079 spdk_app_start Round 1 00:05:57.079 19:04:15 -- event/event.sh@25 -- # waitforlisten 1279367 /var/tmp/spdk-nbd.sock 00:05:57.079 19:04:15 -- common/autotest_common.sh@829 -- # '[' -z 1279367 ']' 00:05:57.079 19:04:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:57.079 19:04:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:57.079 19:04:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:57.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
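(Round 0 has been torn down here: the NBD devices were stopped, spdk_kill_instance SIGTERM stopped the app framework, and app_repeat restarted it in-process for Round 1, hence the fresh "Total cores available: 2" banner above. The driving loop, reconstructed from the event.sh trace rather than quoted from it, looks roughly like:

    # One verification pass per restart; app_repeat itself re-runs spdk_app_start.
    for i in {0..2}; do
        echo "spdk_app_start Round $i"
        waitforlisten $repeat_pid /var/tmp/spdk-nbd.sock
        # ...create malloc bdevs, verify over NBD (see the dd/cmp pass above)...
        rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
        sleep 3
    done
)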
00:05:57.079 19:04:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:57.079 19:04:15 -- common/autotest_common.sh@10 -- # set +x 00:05:57.338 19:04:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:57.338 19:04:15 -- common/autotest_common.sh@862 -- # return 0 00:05:57.338 19:04:15 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:57.598 Malloc0 00:05:57.598 19:04:16 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:57.598 Malloc1 00:05:57.598 19:04:16 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@12 -- # local i 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:57.598 19:04:16 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:57.858 /dev/nbd0 00:05:57.858 19:04:16 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:57.858 19:04:16 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:57.858 19:04:16 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:57.858 19:04:16 -- common/autotest_common.sh@867 -- # local i 00:05:57.858 19:04:16 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:57.858 19:04:16 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:57.858 19:04:16 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:57.858 19:04:16 -- common/autotest_common.sh@871 -- # break 00:05:57.858 19:04:16 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:57.858 19:04:16 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:57.858 19:04:16 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:57.858 1+0 records in 00:05:57.858 1+0 records out 00:05:57.858 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236906 s, 17.3 MB/s 00:05:57.858 19:04:16 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.858 19:04:16 -- common/autotest_common.sh@884 -- # size=4096 00:05:57.858 19:04:16 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.858 19:04:16 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:57.858 19:04:16 -- common/autotest_common.sh@887 -- # return 0 00:05:57.858 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:57.858 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:57.858 19:04:16 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:58.117 /dev/nbd1 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:58.117 19:04:16 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:58.117 19:04:16 -- common/autotest_common.sh@867 -- # local i 00:05:58.117 19:04:16 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:58.117 19:04:16 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:58.117 19:04:16 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:58.117 19:04:16 -- common/autotest_common.sh@871 -- # break 00:05:58.117 19:04:16 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:58.117 19:04:16 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:58.117 19:04:16 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:58.117 1+0 records in 00:05:58.117 1+0 records out 00:05:58.117 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021694 s, 18.9 MB/s 00:05:58.117 19:04:16 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.117 19:04:16 -- common/autotest_common.sh@884 -- # size=4096 00:05:58.117 19:04:16 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:58.117 19:04:16 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:58.117 19:04:16 -- common/autotest_common.sh@887 -- # return 0 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.117 19:04:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:58.448 { 00:05:58.448 "nbd_device": "/dev/nbd0", 00:05:58.448 "bdev_name": "Malloc0" 00:05:58.448 }, 00:05:58.448 { 00:05:58.448 "nbd_device": "/dev/nbd1", 00:05:58.448 "bdev_name": "Malloc1" 00:05:58.448 } 00:05:58.448 ]' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:58.448 { 00:05:58.448 "nbd_device": "/dev/nbd0", 00:05:58.448 "bdev_name": "Malloc0" 00:05:58.448 }, 00:05:58.448 { 00:05:58.448 "nbd_device": "/dev/nbd1", 00:05:58.448 "bdev_name": "Malloc1" 00:05:58.448 } 00:05:58.448 ]' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:58.448 /dev/nbd1' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:58.448 /dev/nbd1' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@65 -- # count=2 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:58.448 19:04:16 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:58.448 256+0 records in 00:05:58.448 256+0 records out 00:05:58.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116234 s, 90.2 MB/s 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:58.448 256+0 records in 00:05:58.448 256+0 records out 00:05:58.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199125 s, 52.7 MB/s 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:58.448 256+0 records in 00:05:58.448 256+0 records out 00:05:58.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216154 s, 48.5 MB/s 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@51 -- # local i 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:58.448 19:04:16 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@41 -- # break 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@45 -- # return 0 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:58.718 19:04:17 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@41 -- # break 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@45 -- # return 0 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@65 -- # true 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@65 -- # count=0 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@104 -- # count=0 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:58.982 19:04:17 -- bdev/nbd_common.sh@109 -- # return 0 00:05:58.982 19:04:17 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:59.241 19:04:17 -- event/event.sh@35 -- # sleep 3 00:05:59.500 [2024-11-18 19:04:17.938215] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:59.500 [2024-11-18 19:04:18.001638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:59.500 [2024-11-18 19:04:18.001639] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.500 [2024-11-18 19:04:18.041522] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:59.500 [2024-11-18 19:04:18.041567] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
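The nbd_dd_data_verify calls traced above reduce to a small dd/cmp loop: write one shared random pattern to every exported NBD device with O_DIRECT, then compare each device byte-for-byte against the source file. A minimal standalone sketch of that pattern, not the SPDK helper itself (the scratch path and device list here are assumptions):

  #!/usr/bin/env bash
  # Sketch of the write/verify pattern from the trace above; tmp_file and
  # nbd_list are illustrative values, not the CI workspace paths.
  set -euo pipefail
  tmp_file=/tmp/nbdrandtest
  nbd_list=(/dev/nbd0 /dev/nbd1)
  # Write phase: one random 1 MiB pattern (4096 * 256), O_DIRECT so the
  # data really hits the device instead of the page cache.
  dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
  for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
  done
  # Verify phase: cmp exits non-zero at the first mismatching byte.
  for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
  done
  rm "$tmp_file"

Both cmp calls in the trace return silently, so the 1 MiB pattern round-tripped through both devices intact.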
00:06:02.788 19:04:20 -- event/event.sh@23 -- # for i in {0..2} 00:06:02.789 19:04:20 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:02.789 spdk_app_start Round 2 00:06:02.789 19:04:20 -- event/event.sh@25 -- # waitforlisten 1279367 /var/tmp/spdk-nbd.sock 00:06:02.789 19:04:20 -- common/autotest_common.sh@829 -- # '[' -z 1279367 ']' 00:06:02.789 19:04:20 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:02.789 19:04:20 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:02.789 19:04:20 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:02.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:02.789 19:04:20 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:02.789 19:04:20 -- common/autotest_common.sh@10 -- # set +x 00:06:02.789 19:04:20 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.789 19:04:20 -- common/autotest_common.sh@862 -- # return 0 00:06:02.789 19:04:20 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.789 Malloc0 00:06:02.789 19:04:21 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.789 Malloc1 00:06:02.789 19:04:21 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@12 -- # local i 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.789 19:04:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:03.048 /dev/nbd0 00:06:03.048 19:04:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:03.048 19:04:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:03.048 19:04:21 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:03.048 19:04:21 -- common/autotest_common.sh@867 -- # local i 00:06:03.048 19:04:21 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:03.048 19:04:21 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:03.048 19:04:21 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:03.048 19:04:21 -- common/autotest_common.sh@871 -- # break 00:06:03.048 19:04:21 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:06:03.048 19:04:21 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:03.048 19:04:21 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.048 1+0 records in 00:06:03.048 1+0 records out 00:06:03.048 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023087 s, 17.7 MB/s 00:06:03.048 19:04:21 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.048 19:04:21 -- common/autotest_common.sh@884 -- # size=4096 00:06:03.048 19:04:21 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.048 19:04:21 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:03.048 19:04:21 -- common/autotest_common.sh@887 -- # return 0 00:06:03.048 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.048 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.048 19:04:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:03.308 /dev/nbd1 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:03.308 19:04:21 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:03.308 19:04:21 -- common/autotest_common.sh@867 -- # local i 00:06:03.308 19:04:21 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:03.308 19:04:21 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:03.308 19:04:21 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:03.308 19:04:21 -- common/autotest_common.sh@871 -- # break 00:06:03.308 19:04:21 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:03.308 19:04:21 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:03.308 19:04:21 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.308 1+0 records in 00:06:03.308 1+0 records out 00:06:03.308 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254839 s, 16.1 MB/s 00:06:03.308 19:04:21 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.308 19:04:21 -- common/autotest_common.sh@884 -- # size=4096 00:06:03.308 19:04:21 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.308 19:04:21 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:03.308 19:04:21 -- common/autotest_common.sh@887 -- # return 0 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.308 19:04:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:03.567 { 00:06:03.567 "nbd_device": "/dev/nbd0", 00:06:03.567 "bdev_name": "Malloc0" 00:06:03.567 }, 00:06:03.567 { 00:06:03.567 "nbd_device": "/dev/nbd1", 00:06:03.567 "bdev_name": "Malloc1" 00:06:03.567 } 00:06:03.567 ]' 00:06:03.567 19:04:21 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:06:03.567 { 00:06:03.567 "nbd_device": "/dev/nbd0", 00:06:03.567 "bdev_name": "Malloc0" 00:06:03.567 }, 00:06:03.567 { 00:06:03.567 "nbd_device": "/dev/nbd1", 00:06:03.567 "bdev_name": "Malloc1" 00:06:03.567 } 00:06:03.567 ]' 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:03.567 /dev/nbd1' 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:03.567 /dev/nbd1' 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@65 -- # count=2 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@95 -- # count=2 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:03.567 19:04:21 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:03.568 256+0 records in 00:06:03.568 256+0 records out 00:06:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011171 s, 93.9 MB/s 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:03.568 256+0 records in 00:06:03.568 256+0 records out 00:06:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195135 s, 53.7 MB/s 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:03.568 19:04:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:03.568 256+0 records in 00:06:03.568 256+0 records out 00:06:03.568 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0193773 s, 54.1 MB/s 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@51 -- # local i 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:03.568 19:04:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@41 -- # break 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@41 -- # break 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.827 19:04:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@65 -- # true 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@65 -- # count=0 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@104 -- # count=0 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:04.087 19:04:22 -- bdev/nbd_common.sh@109 -- # return 0 00:06:04.087 19:04:22 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:04.346 19:04:22 -- event/event.sh@35 -- # sleep 3 00:06:04.606 [2024-11-18 19:04:23.036603] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:04.606 [2024-11-18 19:04:23.100260] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:04.606 [2024-11-18 19:04:23.100261] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.606 [2024-11-18 19:04:23.140625] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:04.606 [2024-11-18 19:04:23.140668] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:07.896 19:04:25 -- event/event.sh@38 -- # waitforlisten 1279367 /var/tmp/spdk-nbd.sock 00:06:07.896 19:04:25 -- common/autotest_common.sh@829 -- # '[' -z 1279367 ']' 00:06:07.896 19:04:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:07.896 19:04:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.896 19:04:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:07.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:07.896 19:04:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.896 19:04:25 -- common/autotest_common.sh@10 -- # set +x 00:06:07.896 19:04:26 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:07.896 19:04:26 -- common/autotest_common.sh@862 -- # return 0 00:06:07.896 19:04:26 -- event/event.sh@39 -- # killprocess 1279367 00:06:07.896 19:04:26 -- common/autotest_common.sh@936 -- # '[' -z 1279367 ']' 00:06:07.896 19:04:26 -- common/autotest_common.sh@940 -- # kill -0 1279367 00:06:07.896 19:04:26 -- common/autotest_common.sh@941 -- # uname 00:06:07.896 19:04:26 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:07.896 19:04:26 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1279367 00:06:07.896 19:04:26 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:07.896 19:04:26 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:07.896 19:04:26 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1279367' 00:06:07.896 killing process with pid 1279367 00:06:07.896 19:04:26 -- common/autotest_common.sh@955 -- # kill 1279367 00:06:07.896 19:04:26 -- common/autotest_common.sh@960 -- # wait 1279367 00:06:07.896 spdk_app_start is called in Round 0. 00:06:07.896 Shutdown signal received, stop current app iteration 00:06:07.896 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:07.896 spdk_app_start is called in Round 1. 00:06:07.896 Shutdown signal received, stop current app iteration 00:06:07.896 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:07.896 spdk_app_start is called in Round 2. 00:06:07.896 Shutdown signal received, stop current app iteration 00:06:07.896 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:07.896 spdk_app_start is called in Round 3. 
00:06:07.896 Shutdown signal received, stop current app iteration 00:06:07.896 19:04:26 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:07.896 19:04:26 -- event/event.sh@42 -- # return 0 00:06:07.896 00:06:07.896 real 0m16.393s 00:06:07.896 user 0m34.970s 00:06:07.896 sys 0m2.990s 00:06:07.896 19:04:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.896 19:04:26 -- common/autotest_common.sh@10 -- # set +x 00:06:07.896 ************************************ 00:06:07.896 END TEST app_repeat 00:06:07.896 ************************************ 00:06:07.896 19:04:26 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:07.896 19:04:26 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:07.896 19:04:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:07.896 19:04:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.896 19:04:26 -- common/autotest_common.sh@10 -- # set +x 00:06:07.896 ************************************ 00:06:07.896 START TEST cpu_locks 00:06:07.896 ************************************ 00:06:07.896 19:04:26 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:07.896 * Looking for test storage... 00:06:07.896 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:07.896 19:04:26 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:07.896 19:04:26 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:07.896 19:04:26 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:07.896 19:04:26 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:07.896 19:04:26 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:07.896 19:04:26 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:07.896 19:04:26 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:07.896 19:04:26 -- scripts/common.sh@335 -- # IFS=.-: 00:06:07.896 19:04:26 -- scripts/common.sh@335 -- # read -ra ver1 00:06:07.896 19:04:26 -- scripts/common.sh@336 -- # IFS=.-: 00:06:07.896 19:04:26 -- scripts/common.sh@336 -- # read -ra ver2 00:06:07.896 19:04:26 -- scripts/common.sh@337 -- # local 'op=<' 00:06:07.896 19:04:26 -- scripts/common.sh@339 -- # ver1_l=2 00:06:07.896 19:04:26 -- scripts/common.sh@340 -- # ver2_l=1 00:06:07.896 19:04:26 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:07.896 19:04:26 -- scripts/common.sh@343 -- # case "$op" in 00:06:07.896 19:04:26 -- scripts/common.sh@344 -- # : 1 00:06:07.896 19:04:26 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:07.896 19:04:26 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:07.896 19:04:26 -- scripts/common.sh@364 -- # decimal 1 00:06:07.896 19:04:26 -- scripts/common.sh@352 -- # local d=1 00:06:07.896 19:04:26 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:07.896 19:04:26 -- scripts/common.sh@354 -- # echo 1 00:06:07.896 19:04:26 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:07.896 19:04:26 -- scripts/common.sh@365 -- # decimal 2 00:06:07.896 19:04:26 -- scripts/common.sh@352 -- # local d=2 00:06:07.896 19:04:26 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:07.896 19:04:26 -- scripts/common.sh@354 -- # echo 2 00:06:07.896 19:04:26 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:07.896 19:04:26 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:07.896 19:04:26 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:07.896 19:04:26 -- scripts/common.sh@367 -- # return 0 00:06:07.896 19:04:26 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:07.896 19:04:26 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:07.896 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.896 --rc genhtml_branch_coverage=1 00:06:07.896 --rc genhtml_function_coverage=1 00:06:07.896 --rc genhtml_legend=1 00:06:07.896 --rc geninfo_all_blocks=1 00:06:07.896 --rc geninfo_unexecuted_blocks=1 00:06:07.896 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.896 ' 00:06:07.896 19:04:26 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:07.896 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.897 --rc genhtml_branch_coverage=1 00:06:07.897 --rc genhtml_function_coverage=1 00:06:07.897 --rc genhtml_legend=1 00:06:07.897 --rc geninfo_all_blocks=1 00:06:07.897 --rc geninfo_unexecuted_blocks=1 00:06:07.897 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.897 ' 00:06:07.897 19:04:26 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:07.897 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.897 --rc genhtml_branch_coverage=1 00:06:07.897 --rc genhtml_function_coverage=1 00:06:07.897 --rc genhtml_legend=1 00:06:07.897 --rc geninfo_all_blocks=1 00:06:07.897 --rc geninfo_unexecuted_blocks=1 00:06:07.897 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.897 ' 00:06:07.897 19:04:26 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:07.897 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:07.897 --rc genhtml_branch_coverage=1 00:06:07.897 --rc genhtml_function_coverage=1 00:06:07.897 --rc genhtml_legend=1 00:06:07.897 --rc geninfo_all_blocks=1 00:06:07.897 --rc geninfo_unexecuted_blocks=1 00:06:07.897 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:07.897 ' 00:06:07.897 19:04:26 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:07.897 19:04:26 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:07.897 19:04:26 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:07.897 19:04:26 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:07.897 19:04:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:07.897 19:04:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.897 19:04:26 -- common/autotest_common.sh@10 -- # set +x 00:06:08.156 ************************************ 00:06:08.156 START TEST default_locks 
00:06:08.156 ************************************ 00:06:08.156 19:04:26 -- common/autotest_common.sh@1114 -- # default_locks 00:06:08.156 19:04:26 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1282569 00:06:08.156 19:04:26 -- event/cpu_locks.sh@47 -- # waitforlisten 1282569 00:06:08.156 19:04:26 -- common/autotest_common.sh@829 -- # '[' -z 1282569 ']' 00:06:08.156 19:04:26 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.156 19:04:26 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:08.156 19:04:26 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.156 19:04:26 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:08.156 19:04:26 -- common/autotest_common.sh@10 -- # set +x 00:06:08.156 19:04:26 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:08.156 [2024-11-18 19:04:26.528010] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:08.156 [2024-11-18 19:04:26.528093] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1282569 ] 00:06:08.156 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.156 [2024-11-18 19:04:26.597555] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.156 [2024-11-18 19:04:26.671493] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:08.156 [2024-11-18 19:04:26.671599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.094 19:04:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.094 19:04:27 -- common/autotest_common.sh@862 -- # return 0 00:06:09.094 19:04:27 -- event/cpu_locks.sh@49 -- # locks_exist 1282569 00:06:09.094 19:04:27 -- event/cpu_locks.sh@22 -- # lslocks -p 1282569 00:06:09.094 19:04:27 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:09.663 lslocks: write error 00:06:09.663 19:04:28 -- event/cpu_locks.sh@50 -- # killprocess 1282569 00:06:09.663 19:04:28 -- common/autotest_common.sh@936 -- # '[' -z 1282569 ']' 00:06:09.663 19:04:28 -- common/autotest_common.sh@940 -- # kill -0 1282569 00:06:09.663 19:04:28 -- common/autotest_common.sh@941 -- # uname 00:06:09.663 19:04:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:09.663 19:04:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1282569 00:06:09.663 19:04:28 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:09.663 19:04:28 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:09.663 19:04:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1282569' 00:06:09.663 killing process with pid 1282569 00:06:09.663 19:04:28 -- common/autotest_common.sh@955 -- # kill 1282569 00:06:09.663 19:04:28 -- common/autotest_common.sh@960 -- # wait 1282569 00:06:09.922 19:04:28 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1282569 00:06:09.922 19:04:28 -- common/autotest_common.sh@650 -- # local es=0 00:06:09.922 19:04:28 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1282569 00:06:09.922 19:04:28 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:09.922 19:04:28 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:09.922 19:04:28 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:09.922 19:04:28 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:09.922 19:04:28 -- common/autotest_common.sh@653 -- # waitforlisten 1282569 00:06:09.922 19:04:28 -- common/autotest_common.sh@829 -- # '[' -z 1282569 ']' 00:06:09.922 19:04:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.922 19:04:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:09.922 19:04:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:09.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:09.922 19:04:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:09.922 19:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:09.922 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1282569) - No such process 00:06:09.922 ERROR: process (pid: 1282569) is no longer running 00:06:09.922 19:04:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.922 19:04:28 -- common/autotest_common.sh@862 -- # return 1 00:06:09.922 19:04:28 -- common/autotest_common.sh@653 -- # es=1 00:06:09.922 19:04:28 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:09.922 19:04:28 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:09.922 19:04:28 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:09.922 19:04:28 -- event/cpu_locks.sh@54 -- # no_locks 00:06:09.922 19:04:28 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:09.922 19:04:28 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:09.922 19:04:28 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:09.922 00:06:09.922 real 0m1.881s 00:06:09.922 user 0m1.983s 00:06:09.922 sys 0m0.661s 00:06:09.922 19:04:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:09.922 19:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:09.922 ************************************ 00:06:09.922 END TEST default_locks 00:06:09.922 ************************************ 00:06:09.922 19:04:28 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:09.922 19:04:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:09.922 19:04:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:09.922 19:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:09.922 ************************************ 00:06:09.922 START TEST default_locks_via_rpc 00:06:09.922 ************************************ 00:06:09.922 19:04:28 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:09.922 19:04:28 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1282876 00:06:09.922 19:04:28 -- event/cpu_locks.sh@63 -- # waitforlisten 1282876 00:06:09.922 19:04:28 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:09.922 19:04:28 -- common/autotest_common.sh@829 -- # '[' -z 1282876 ']' 00:06:09.922 19:04:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.922 19:04:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:09.922 19:04:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:09.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:09.922 19:04:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:09.922 19:04:28 -- common/autotest_common.sh@10 -- # set +x 00:06:09.922 [2024-11-18 19:04:28.457052] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:09.922 [2024-11-18 19:04:28.457146] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1282876 ] 00:06:09.922 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.181 [2024-11-18 19:04:28.525093] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.181 [2024-11-18 19:04:28.589992] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:10.181 [2024-11-18 19:04:28.590097] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.758 19:04:29 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:10.758 19:04:29 -- common/autotest_common.sh@862 -- # return 0 00:06:10.758 19:04:29 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:10.758 19:04:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.758 19:04:29 -- common/autotest_common.sh@10 -- # set +x 00:06:10.758 19:04:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.758 19:04:29 -- event/cpu_locks.sh@67 -- # no_locks 00:06:10.758 19:04:29 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:10.758 19:04:29 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:10.758 19:04:29 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:10.758 19:04:29 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:10.758 19:04:29 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:10.758 19:04:29 -- common/autotest_common.sh@10 -- # set +x 00:06:10.758 19:04:29 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:10.758 19:04:29 -- event/cpu_locks.sh@71 -- # locks_exist 1282876 00:06:10.758 19:04:29 -- event/cpu_locks.sh@22 -- # lslocks -p 1282876 00:06:10.758 19:04:29 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:11.328 19:04:29 -- event/cpu_locks.sh@73 -- # killprocess 1282876 00:06:11.328 19:04:29 -- common/autotest_common.sh@936 -- # '[' -z 1282876 ']' 00:06:11.329 19:04:29 -- common/autotest_common.sh@940 -- # kill -0 1282876 00:06:11.329 19:04:29 -- common/autotest_common.sh@941 -- # uname 00:06:11.329 19:04:29 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:11.329 19:04:29 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1282876 00:06:11.329 19:04:29 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:11.329 19:04:29 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:11.329 19:04:29 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1282876' 00:06:11.329 killing process with pid 1282876 00:06:11.329 19:04:29 -- common/autotest_common.sh@955 -- # kill 1282876 00:06:11.329 19:04:29 -- common/autotest_common.sh@960 -- # wait 1282876 00:06:11.588 00:06:11.588 real 0m1.732s 00:06:11.588 user 0m1.835s 00:06:11.588 sys 0m0.596s 00:06:11.588 19:04:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:11.588 19:04:30 -- common/autotest_common.sh@10 -- # set +x 00:06:11.588 ************************************ 00:06:11.588 END TEST default_locks_via_rpc 00:06:11.588 
************************************ 00:06:11.847 19:04:30 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:11.847 19:04:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:11.847 19:04:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:11.847 19:04:30 -- common/autotest_common.sh@10 -- # set +x 00:06:11.847 ************************************ 00:06:11.847 START TEST non_locking_app_on_locked_coremask 00:06:11.847 ************************************ 00:06:11.847 19:04:30 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:11.847 19:04:30 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1283191 00:06:11.847 19:04:30 -- event/cpu_locks.sh@81 -- # waitforlisten 1283191 /var/tmp/spdk.sock 00:06:11.847 19:04:30 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:11.847 19:04:30 -- common/autotest_common.sh@829 -- # '[' -z 1283191 ']' 00:06:11.847 19:04:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.847 19:04:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:11.847 19:04:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.847 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.847 19:04:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:11.847 19:04:30 -- common/autotest_common.sh@10 -- # set +x 00:06:11.847 [2024-11-18 19:04:30.238155] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:11.847 [2024-11-18 19:04:30.238246] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1283191 ] 00:06:11.847 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.847 [2024-11-18 19:04:30.307512] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.847 [2024-11-18 19:04:30.385453] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:11.847 [2024-11-18 19:04:30.385563] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.786 19:04:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:12.786 19:04:31 -- common/autotest_common.sh@862 -- # return 0 00:06:12.786 19:04:31 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1283442 00:06:12.786 19:04:31 -- event/cpu_locks.sh@85 -- # waitforlisten 1283442 /var/tmp/spdk2.sock 00:06:12.786 19:04:31 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:12.786 19:04:31 -- common/autotest_common.sh@829 -- # '[' -z 1283442 ']' 00:06:12.786 19:04:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:12.786 19:04:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.786 19:04:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:12.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
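The non_locking_app_on_locked_coremask setup above runs two targets on the same core mask: the first claims the CPU-core lock, the second opts out with --disable-cpumask-locks and listens on its own RPC socket. A hedged sketch of that arrangement (binary path, mask, and socket name follow the log; the pids and the lock-file detail in the comment are illustrative):

  SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  "$SPDK_BIN" -m 0x1 & pid1=$!     # claims the core-0 lock file under /var/tmp
  "$SPDK_BIN" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock & pid2=$!
  # The held lock is visible to lslocks, which is what locks_exist greps for:
  lslocks -p "$pid1" | grep spdk_cpu_lock
  kill "$pid1" "$pid2"

The stray 'lslocks: write error' lines in the trace are harmless: grep -q exits on the first match and closes the pipe, so lslocks reports a failed write.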
00:06:12.786 19:04:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.786 19:04:31 -- common/autotest_common.sh@10 -- # set +x 00:06:12.786 [2024-11-18 19:04:31.095932] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:12.786 [2024-11-18 19:04:31.096010] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1283442 ] 00:06:12.786 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.786 [2024-11-18 19:04:31.188958] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:12.786 [2024-11-18 19:04:31.188986] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.786 [2024-11-18 19:04:31.327361] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:12.786 [2024-11-18 19:04:31.327465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.354 19:04:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.354 19:04:31 -- common/autotest_common.sh@862 -- # return 0 00:06:13.354 19:04:31 -- event/cpu_locks.sh@87 -- # locks_exist 1283191 00:06:13.354 19:04:31 -- event/cpu_locks.sh@22 -- # lslocks -p 1283191 00:06:13.354 19:04:31 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:14.291 lslocks: write error 00:06:14.291 19:04:32 -- event/cpu_locks.sh@89 -- # killprocess 1283191 00:06:14.291 19:04:32 -- common/autotest_common.sh@936 -- # '[' -z 1283191 ']' 00:06:14.291 19:04:32 -- common/autotest_common.sh@940 -- # kill -0 1283191 00:06:14.291 19:04:32 -- common/autotest_common.sh@941 -- # uname 00:06:14.291 19:04:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:14.291 19:04:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1283191 00:06:14.291 19:04:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:14.291 19:04:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:14.291 19:04:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1283191' 00:06:14.291 killing process with pid 1283191 00:06:14.291 19:04:32 -- common/autotest_common.sh@955 -- # kill 1283191 00:06:14.291 19:04:32 -- common/autotest_common.sh@960 -- # wait 1283191 00:06:14.860 19:04:33 -- event/cpu_locks.sh@90 -- # killprocess 1283442 00:06:14.860 19:04:33 -- common/autotest_common.sh@936 -- # '[' -z 1283442 ']' 00:06:14.860 19:04:33 -- common/autotest_common.sh@940 -- # kill -0 1283442 00:06:14.860 19:04:33 -- common/autotest_common.sh@941 -- # uname 00:06:14.860 19:04:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:14.860 19:04:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1283442 00:06:15.119 19:04:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:15.119 19:04:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:15.119 19:04:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1283442' 00:06:15.119 killing process with pid 1283442 00:06:15.119 19:04:33 -- common/autotest_common.sh@955 -- # kill 1283442 00:06:15.119 19:04:33 -- common/autotest_common.sh@960 -- # wait 1283442 00:06:15.379 00:06:15.379 real 0m3.580s 00:06:15.379 user 0m3.826s 00:06:15.379 sys 0m1.161s 00:06:15.379 19:04:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:15.379 19:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:15.379 
************************************ 00:06:15.379 END TEST non_locking_app_on_locked_coremask 00:06:15.379 ************************************ 00:06:15.379 19:04:33 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:15.379 19:04:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:15.379 19:04:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:15.379 19:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:15.379 ************************************ 00:06:15.379 START TEST locking_app_on_unlocked_coremask 00:06:15.379 ************************************ 00:06:15.379 19:04:33 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:15.379 19:04:33 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1284008 00:06:15.379 19:04:33 -- event/cpu_locks.sh@99 -- # waitforlisten 1284008 /var/tmp/spdk.sock 00:06:15.379 19:04:33 -- common/autotest_common.sh@829 -- # '[' -z 1284008 ']' 00:06:15.379 19:04:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.379 19:04:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:15.379 19:04:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.379 19:04:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:15.379 19:04:33 -- common/autotest_common.sh@10 -- # set +x 00:06:15.379 19:04:33 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:15.379 [2024-11-18 19:04:33.862421] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:15.379 [2024-11-18 19:04:33.862501] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1284008 ] 00:06:15.379 EAL: No free 2048 kB hugepages reported on node 1 00:06:15.379 [2024-11-18 19:04:33.932488] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:15.379 [2024-11-18 19:04:33.932514] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.638 [2024-11-18 19:04:34.007783] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:15.638 [2024-11-18 19:04:34.007878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.206 19:04:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:16.206 19:04:34 -- common/autotest_common.sh@862 -- # return 0 00:06:16.206 19:04:34 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1284027 00:06:16.206 19:04:34 -- event/cpu_locks.sh@103 -- # waitforlisten 1284027 /var/tmp/spdk2.sock 00:06:16.206 19:04:34 -- common/autotest_common.sh@829 -- # '[' -z 1284027 ']' 00:06:16.206 19:04:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.206 19:04:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.206 19:04:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.206 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
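Every 'Waiting for process to start up and listen on UNIX domain socket ...' line above comes from the waitforlisten helper, which polls the RPC socket until the target answers (the trace shows local max_retries=100). A minimal sketch of that polling loop, assuming rpc_get_methods as the probe call and a half-second retry interval:

  rpc_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  rpc_addr=/var/tmp/spdk2.sock
  echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
  for ((i = 1; i <= 100; i++)); do    # max_retries=100, as in the trace
    "$rpc_py" -t 1 -s "$rpc_addr" rpc_get_methods &>/dev/null && break
    sleep 0.5
  done

Once the probe succeeds the helper returns, which matches the (( i == 0 )) / return 0 xtrace pair that follows each wait in the log.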
00:06:16.206 19:04:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.206 19:04:34 -- common/autotest_common.sh@10 -- # set +x 00:06:16.206 19:04:34 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:16.206 [2024-11-18 19:04:34.714869] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:16.206 [2024-11-18 19:04:34.714955] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1284027 ] 00:06:16.206 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.465 [2024-11-18 19:04:34.808760] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.465 [2024-11-18 19:04:34.945207] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.465 [2024-11-18 19:04:34.945322] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.032 19:04:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.032 19:04:35 -- common/autotest_common.sh@862 -- # return 0 00:06:17.032 19:04:35 -- event/cpu_locks.sh@105 -- # locks_exist 1284027 00:06:17.032 19:04:35 -- event/cpu_locks.sh@22 -- # lslocks -p 1284027 00:06:17.032 19:04:35 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:17.969 lslocks: write error 00:06:17.969 19:04:36 -- event/cpu_locks.sh@107 -- # killprocess 1284008 00:06:17.969 19:04:36 -- common/autotest_common.sh@936 -- # '[' -z 1284008 ']' 00:06:17.969 19:04:36 -- common/autotest_common.sh@940 -- # kill -0 1284008 00:06:17.969 19:04:36 -- common/autotest_common.sh@941 -- # uname 00:06:17.969 19:04:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:17.969 19:04:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1284008 00:06:17.969 19:04:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:17.969 19:04:36 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:17.969 19:04:36 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1284008' 00:06:17.970 killing process with pid 1284008 00:06:17.970 19:04:36 -- common/autotest_common.sh@955 -- # kill 1284008 00:06:17.970 19:04:36 -- common/autotest_common.sh@960 -- # wait 1284008 00:06:18.538 19:04:36 -- event/cpu_locks.sh@108 -- # killprocess 1284027 00:06:18.538 19:04:36 -- common/autotest_common.sh@936 -- # '[' -z 1284027 ']' 00:06:18.538 19:04:36 -- common/autotest_common.sh@940 -- # kill -0 1284027 00:06:18.538 19:04:36 -- common/autotest_common.sh@941 -- # uname 00:06:18.538 19:04:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.538 19:04:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1284027 00:06:18.538 19:04:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.538 19:04:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.538 19:04:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1284027' 00:06:18.538 killing process with pid 1284027 00:06:18.538 19:04:37 -- common/autotest_common.sh@955 -- # kill 1284027 00:06:18.538 19:04:37 -- common/autotest_common.sh@960 -- # wait 1284027 00:06:18.797 00:06:18.797 real 0m3.465s 00:06:18.797 user 0m3.723s 00:06:18.797 sys 0m1.082s 00:06:18.797 19:04:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.797 19:04:37 -- 
common/autotest_common.sh@10 -- # set +x 00:06:18.797 ************************************ 00:06:18.797 END TEST locking_app_on_unlocked_coremask 00:06:18.797 ************************************ 00:06:18.797 19:04:37 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:18.797 19:04:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.797 19:04:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.797 19:04:37 -- common/autotest_common.sh@10 -- # set +x 00:06:18.797 ************************************ 00:06:18.797 START TEST locking_app_on_locked_coremask 00:06:18.797 ************************************ 00:06:18.797 19:04:37 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:18.797 19:04:37 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1284600 00:06:18.797 19:04:37 -- event/cpu_locks.sh@116 -- # waitforlisten 1284600 /var/tmp/spdk.sock 00:06:18.797 19:04:37 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:18.797 19:04:37 -- common/autotest_common.sh@829 -- # '[' -z 1284600 ']' 00:06:18.797 19:04:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.797 19:04:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:18.797 19:04:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.797 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.797 19:04:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:18.797 19:04:37 -- common/autotest_common.sh@10 -- # set +x 00:06:18.797 [2024-11-18 19:04:37.378222] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:18.797 [2024-11-18 19:04:37.378295] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1284600 ] 00:06:19.056 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.056 [2024-11-18 19:04:37.445048] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.056 [2024-11-18 19:04:37.509370] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.056 [2024-11-18 19:04:37.509490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.622 19:04:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:19.622 19:04:38 -- common/autotest_common.sh@862 -- # return 0 00:06:19.622 19:04:38 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1284807 00:06:19.622 19:04:38 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1284807 /var/tmp/spdk2.sock 00:06:19.622 19:04:38 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:19.622 19:04:38 -- common/autotest_common.sh@650 -- # local es=0 00:06:19.622 19:04:38 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1284807 /var/tmp/spdk2.sock 00:06:19.622 19:04:38 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:19.622 19:04:38 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:19.622 19:04:38 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:19.622 19:04:38 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:19.622 19:04:38 -- common/autotest_common.sh@653 -- # waitforlisten 1284807 /var/tmp/spdk2.sock 00:06:19.622 19:04:38 -- common/autotest_common.sh@829 -- # '[' -z 1284807 ']' 00:06:19.622 19:04:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:19.622 19:04:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.622 19:04:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:19.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:19.622 19:04:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.622 19:04:38 -- common/autotest_common.sh@10 -- # set +x 00:06:19.622 [2024-11-18 19:04:38.223712] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:19.622 [2024-11-18 19:04:38.223796] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1284807 ] 00:06:19.881 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.882 [2024-11-18 19:04:38.318853] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1284600 has claimed it. 00:06:19.882 [2024-11-18 19:04:38.318894] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
00:06:20.450 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1284807) - No such process 00:06:20.450 ERROR: process (pid: 1284807) is no longer running 00:06:20.450 19:04:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.450 19:04:38 -- common/autotest_common.sh@862 -- # return 1 00:06:20.450 19:04:38 -- common/autotest_common.sh@653 -- # es=1 00:06:20.450 19:04:38 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:20.450 19:04:38 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:20.450 19:04:38 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:20.450 19:04:38 -- event/cpu_locks.sh@122 -- # locks_exist 1284600 00:06:20.450 19:04:38 -- event/cpu_locks.sh@22 -- # lslocks -p 1284600 00:06:20.450 19:04:38 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:21.018 lslocks: write error 00:06:21.018 19:04:39 -- event/cpu_locks.sh@124 -- # killprocess 1284600 00:06:21.018 19:04:39 -- common/autotest_common.sh@936 -- # '[' -z 1284600 ']' 00:06:21.018 19:04:39 -- common/autotest_common.sh@940 -- # kill -0 1284600 00:06:21.018 19:04:39 -- common/autotest_common.sh@941 -- # uname 00:06:21.018 19:04:39 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:21.018 19:04:39 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1284600 00:06:21.018 19:04:39 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:21.018 19:04:39 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:21.018 19:04:39 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1284600' 00:06:21.018 killing process with pid 1284600 00:06:21.018 19:04:39 -- common/autotest_common.sh@955 -- # kill 1284600 00:06:21.018 19:04:39 -- common/autotest_common.sh@960 -- # wait 1284600 00:06:21.277 00:06:21.277 real 0m2.380s 00:06:21.277 user 0m2.629s 00:06:21.277 sys 0m0.708s 00:06:21.277 19:04:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.277 19:04:39 -- common/autotest_common.sh@10 -- # set +x 00:06:21.277 ************************************ 00:06:21.277 END TEST locking_app_on_locked_coremask 00:06:21.277 ************************************ 00:06:21.277 19:04:39 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:21.277 19:04:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:21.277 19:04:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.277 19:04:39 -- common/autotest_common.sh@10 -- # set +x 00:06:21.277 ************************************ 00:06:21.277 START TEST locking_overlapped_coremask 00:06:21.277 ************************************ 00:06:21.277 19:04:39 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:21.277 19:04:39 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1285124 00:06:21.277 19:04:39 -- event/cpu_locks.sh@133 -- # waitforlisten 1285124 /var/tmp/spdk.sock 00:06:21.277 19:04:39 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:21.277 19:04:39 -- common/autotest_common.sh@829 -- # '[' -z 1285124 ']' 00:06:21.277 19:04:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.277 19:04:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:21.277 19:04:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:21.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.277 19:04:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:21.277 19:04:39 -- common/autotest_common.sh@10 -- # set +x 00:06:21.277 [2024-11-18 19:04:39.807373] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:21.277 [2024-11-18 19:04:39.807445] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1285124 ] 00:06:21.277 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.277 [2024-11-18 19:04:39.877032] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:21.536 [2024-11-18 19:04:39.953446] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:21.536 [2024-11-18 19:04:39.953577] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.536 [2024-11-18 19:04:39.953674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.536 [2024-11-18 19:04:39.953675] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:22.104 19:04:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:22.104 19:04:40 -- common/autotest_common.sh@862 -- # return 0 00:06:22.104 19:04:40 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1285184 00:06:22.104 19:04:40 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1285184 /var/tmp/spdk2.sock 00:06:22.104 19:04:40 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:22.104 19:04:40 -- common/autotest_common.sh@650 -- # local es=0 00:06:22.104 19:04:40 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1285184 /var/tmp/spdk2.sock 00:06:22.104 19:04:40 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:22.104 19:04:40 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.104 19:04:40 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:22.104 19:04:40 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:22.104 19:04:40 -- common/autotest_common.sh@653 -- # waitforlisten 1285184 /var/tmp/spdk2.sock 00:06:22.104 19:04:40 -- common/autotest_common.sh@829 -- # '[' -z 1285184 ']' 00:06:22.104 19:04:40 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:22.104 19:04:40 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:22.104 19:04:40 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:22.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:22.104 19:04:40 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:22.104 19:04:40 -- common/autotest_common.sh@10 -- # set +x 00:06:22.104 [2024-11-18 19:04:40.682890] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:22.104 [2024-11-18 19:04:40.682975] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1285184 ] 00:06:22.363 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.363 [2024-11-18 19:04:40.776590] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1285124 has claimed it. 00:06:22.363 [2024-11-18 19:04:40.776624] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:22.931 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1285184) - No such process 00:06:22.931 ERROR: process (pid: 1285184) is no longer running 00:06:22.931 19:04:41 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:22.931 19:04:41 -- common/autotest_common.sh@862 -- # return 1 00:06:22.931 19:04:41 -- common/autotest_common.sh@653 -- # es=1 00:06:22.931 19:04:41 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:22.931 19:04:41 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:22.931 19:04:41 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:22.931 19:04:41 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:22.931 19:04:41 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:22.931 19:04:41 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:22.931 19:04:41 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:22.931 19:04:41 -- event/cpu_locks.sh@141 -- # killprocess 1285124 00:06:22.931 19:04:41 -- common/autotest_common.sh@936 -- # '[' -z 1285124 ']' 00:06:22.931 19:04:41 -- common/autotest_common.sh@940 -- # kill -0 1285124 00:06:22.931 19:04:41 -- common/autotest_common.sh@941 -- # uname 00:06:22.931 19:04:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:22.931 19:04:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1285124 00:06:22.931 19:04:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:22.931 19:04:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:22.931 19:04:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1285124' 00:06:22.931 killing process with pid 1285124 00:06:22.931 19:04:41 -- common/autotest_common.sh@955 -- # kill 1285124 00:06:22.931 19:04:41 -- common/autotest_common.sh@960 -- # wait 1285124 00:06:23.191 00:06:23.191 real 0m1.925s 00:06:23.191 user 0m5.463s 00:06:23.191 sys 0m0.459s 00:06:23.191 19:04:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:23.191 19:04:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.191 ************************************ 00:06:23.191 END TEST locking_overlapped_coremask 00:06:23.191 ************************************ 00:06:23.191 19:04:41 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:23.191 19:04:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:23.191 19:04:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:23.191 19:04:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.191 ************************************ 00:06:23.191 
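[annotation] The check_remaining_locks step traced above (cpu_locks.sh@36-38) verifies that a target started with -m 0x7 leaves behind exactly the lock files for cores 0-2. A minimal standalone version of that comparison, using the same globs the trace shows:

    # Sketch of the check_remaining_locks comparison traced above:
    # with coremask 0x7 (cores 0, 1, 2), exactly these files must exist.
    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
    if [[ "${locks[*]}" != "${locks_expected[*]}" ]]; then
        echo "unexpected lock files: ${locks[*]}" >&2
        exit 1
    fi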
START TEST locking_overlapped_coremask_via_rpc 00:06:23.191 ************************************ 00:06:23.191 19:04:41 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:23.191 19:04:41 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1285474 00:06:23.191 19:04:41 -- event/cpu_locks.sh@149 -- # waitforlisten 1285474 /var/tmp/spdk.sock 00:06:23.191 19:04:41 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:23.191 19:04:41 -- common/autotest_common.sh@829 -- # '[' -z 1285474 ']' 00:06:23.191 19:04:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.191 19:04:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:23.191 19:04:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.191 19:04:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:23.191 19:04:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.191 [2024-11-18 19:04:41.783385] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:23.191 [2024-11-18 19:04:41.783463] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1285474 ] 00:06:23.450 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.450 [2024-11-18 19:04:41.854303] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:23.450 [2024-11-18 19:04:41.854327] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:23.450 [2024-11-18 19:04:41.929846] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:23.450 [2024-11-18 19:04:41.929970] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.450 [2024-11-18 19:04:41.930066] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.450 [2024-11-18 19:04:41.930068] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.386 19:04:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.386 19:04:42 -- common/autotest_common.sh@862 -- # return 0 00:06:24.386 19:04:42 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1285594 00:06:24.386 19:04:42 -- event/cpu_locks.sh@153 -- # waitforlisten 1285594 /var/tmp/spdk2.sock 00:06:24.386 19:04:42 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:24.386 19:04:42 -- common/autotest_common.sh@829 -- # '[' -z 1285594 ']' 00:06:24.386 19:04:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.386 19:04:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.386 19:04:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.386 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:24.386 19:04:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.386 19:04:42 -- common/autotest_common.sh@10 -- # set +x 00:06:24.386 [2024-11-18 19:04:42.657705] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:24.386 [2024-11-18 19:04:42.657768] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1285594 ] 00:06:24.386 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.386 [2024-11-18 19:04:42.753025] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:24.386 [2024-11-18 19:04:42.753046] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:24.386 [2024-11-18 19:04:42.896752] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.386 [2024-11-18 19:04:42.896902] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:24.386 [2024-11-18 19:04:42.897036] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:24.386 [2024-11-18 19:04:42.897038] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:24.957 19:04:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.957 19:04:43 -- common/autotest_common.sh@862 -- # return 0 00:06:24.957 19:04:43 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:24.957 19:04:43 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:24.957 19:04:43 -- common/autotest_common.sh@10 -- # set +x 00:06:24.957 19:04:43 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:24.957 19:04:43 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:24.957 19:04:43 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.957 19:04:43 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:24.957 19:04:43 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:24.957 19:04:43 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.957 19:04:43 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:24.957 19:04:43 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.957 19:04:43 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:24.957 19:04:43 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:24.957 19:04:43 -- common/autotest_common.sh@10 -- # set +x 00:06:24.957 [2024-11-18 19:04:43.521614] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1285474 has claimed it. 
00:06:24.957 request: 00:06:24.957 { 00:06:24.957 "method": "framework_enable_cpumask_locks", 00:06:24.957 "req_id": 1 00:06:24.957 } 00:06:24.957 Got JSON-RPC error response 00:06:24.957 response: 00:06:24.957 { 00:06:24.957 "code": -32603, 00:06:24.957 "message": "Failed to claim CPU core: 2" 00:06:24.957 } 00:06:24.957 19:04:43 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:24.957 19:04:43 -- common/autotest_common.sh@653 -- # es=1 00:06:24.957 19:04:43 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.957 19:04:43 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:24.957 19:04:43 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.957 19:04:43 -- event/cpu_locks.sh@158 -- # waitforlisten 1285474 /var/tmp/spdk.sock 00:06:24.957 19:04:43 -- common/autotest_common.sh@829 -- # '[' -z 1285474 ']' 00:06:24.957 19:04:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:24.957 19:04:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.957 19:04:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:24.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:24.957 19:04:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.957 19:04:43 -- common/autotest_common.sh@10 -- # set +x 00:06:25.217 19:04:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.217 19:04:43 -- common/autotest_common.sh@862 -- # return 0 00:06:25.217 19:04:43 -- event/cpu_locks.sh@159 -- # waitforlisten 1285594 /var/tmp/spdk2.sock 00:06:25.217 19:04:43 -- common/autotest_common.sh@829 -- # '[' -z 1285594 ']' 00:06:25.217 19:04:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:25.217 19:04:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.217 19:04:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:25.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:25.217 19:04:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.217 19:04:43 -- common/autotest_common.sh@10 -- # set +x 00:06:25.476 19:04:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.476 19:04:43 -- common/autotest_common.sh@862 -- # return 0 00:06:25.476 19:04:43 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:25.476 19:04:43 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:25.476 19:04:43 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:25.476 19:04:43 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:25.476 00:06:25.476 real 0m2.144s 00:06:25.476 user 0m0.894s 00:06:25.476 sys 0m0.185s 00:06:25.476 19:04:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:25.476 19:04:43 -- common/autotest_common.sh@10 -- # set +x 00:06:25.476 ************************************ 00:06:25.476 END TEST locking_overlapped_coremask_via_rpc 00:06:25.476 ************************************ 00:06:25.476 19:04:43 -- event/cpu_locks.sh@174 -- # cleanup 00:06:25.476 19:04:43 -- event/cpu_locks.sh@15 -- # [[ -z 1285474 ]] 00:06:25.476 19:04:43 -- event/cpu_locks.sh@15 -- # killprocess 1285474 00:06:25.476 19:04:43 -- common/autotest_common.sh@936 -- # '[' -z 1285474 ']' 00:06:25.476 19:04:43 -- common/autotest_common.sh@940 -- # kill -0 1285474 00:06:25.476 19:04:43 -- common/autotest_common.sh@941 -- # uname 00:06:25.476 19:04:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:25.476 19:04:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1285474 00:06:25.476 19:04:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:25.476 19:04:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:25.476 19:04:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1285474' 00:06:25.476 killing process with pid 1285474 00:06:25.476 19:04:44 -- common/autotest_common.sh@955 -- # kill 1285474 00:06:25.476 19:04:44 -- common/autotest_common.sh@960 -- # wait 1285474 00:06:25.735 19:04:44 -- event/cpu_locks.sh@16 -- # [[ -z 1285594 ]] 00:06:25.735 19:04:44 -- event/cpu_locks.sh@16 -- # killprocess 1285594 00:06:25.735 19:04:44 -- common/autotest_common.sh@936 -- # '[' -z 1285594 ']' 00:06:25.735 19:04:44 -- common/autotest_common.sh@940 -- # kill -0 1285594 00:06:25.735 19:04:44 -- common/autotest_common.sh@941 -- # uname 00:06:25.735 19:04:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:25.735 19:04:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1285594 00:06:25.995 19:04:44 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:25.995 19:04:44 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:25.995 19:04:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1285594' 00:06:25.995 killing process with pid 1285594 00:06:25.995 19:04:44 -- common/autotest_common.sh@955 -- # kill 1285594 00:06:25.995 19:04:44 -- common/autotest_common.sh@960 -- # wait 1285594 00:06:26.253 19:04:44 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.253 19:04:44 -- event/cpu_locks.sh@1 -- # cleanup 00:06:26.253 19:04:44 -- event/cpu_locks.sh@15 -- # [[ -z 1285474 ]] 00:06:26.253 19:04:44 -- event/cpu_locks.sh@15 -- # killprocess 1285474 
00:06:26.253 19:04:44 -- common/autotest_common.sh@936 -- # '[' -z 1285474 ']' 00:06:26.253 19:04:44 -- common/autotest_common.sh@940 -- # kill -0 1285474 00:06:26.253 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1285474) - No such process 00:06:26.253 19:04:44 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1285474 is not found' 00:06:26.253 Process with pid 1285474 is not found 00:06:26.253 19:04:44 -- event/cpu_locks.sh@16 -- # [[ -z 1285594 ]] 00:06:26.253 19:04:44 -- event/cpu_locks.sh@16 -- # killprocess 1285594 00:06:26.253 19:04:44 -- common/autotest_common.sh@936 -- # '[' -z 1285594 ']' 00:06:26.253 19:04:44 -- common/autotest_common.sh@940 -- # kill -0 1285594 00:06:26.253 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1285594) - No such process 00:06:26.253 19:04:44 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1285594 is not found' 00:06:26.253 Process with pid 1285594 is not found 00:06:26.253 19:04:44 -- event/cpu_locks.sh@18 -- # rm -f 00:06:26.253 00:06:26.254 real 0m18.385s 00:06:26.254 user 0m31.225s 00:06:26.254 sys 0m5.810s 00:06:26.254 19:04:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:26.254 19:04:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.254 ************************************ 00:06:26.254 END TEST cpu_locks 00:06:26.254 ************************************ 00:06:26.254 00:06:26.254 real 0m44.075s 00:06:26.254 user 1m23.385s 00:06:26.254 sys 0m9.803s 00:06:26.254 19:04:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:26.254 19:04:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.254 ************************************ 00:06:26.254 END TEST event 00:06:26.254 ************************************ 00:06:26.254 19:04:44 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.254 19:04:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:26.254 19:04:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.254 19:04:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.254 ************************************ 00:06:26.254 START TEST thread 00:06:26.254 ************************************ 00:06:26.254 19:04:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:26.513 * Looking for test storage... 
00:06:26.513 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:26.513 19:04:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:26.513 19:04:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:26.513 19:04:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:26.513 19:04:44 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:26.513 19:04:44 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:26.513 19:04:44 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:26.513 19:04:44 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:26.513 19:04:44 -- scripts/common.sh@335 -- # IFS=.-: 00:06:26.513 19:04:44 -- scripts/common.sh@335 -- # read -ra ver1 00:06:26.513 19:04:44 -- scripts/common.sh@336 -- # IFS=.-: 00:06:26.513 19:04:44 -- scripts/common.sh@336 -- # read -ra ver2 00:06:26.513 19:04:44 -- scripts/common.sh@337 -- # local 'op=<' 00:06:26.513 19:04:44 -- scripts/common.sh@339 -- # ver1_l=2 00:06:26.513 19:04:44 -- scripts/common.sh@340 -- # ver2_l=1 00:06:26.513 19:04:44 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:26.513 19:04:44 -- scripts/common.sh@343 -- # case "$op" in 00:06:26.513 19:04:44 -- scripts/common.sh@344 -- # : 1 00:06:26.513 19:04:44 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:26.513 19:04:44 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:26.513 19:04:44 -- scripts/common.sh@364 -- # decimal 1 00:06:26.513 19:04:44 -- scripts/common.sh@352 -- # local d=1 00:06:26.513 19:04:44 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:26.513 19:04:44 -- scripts/common.sh@354 -- # echo 1 00:06:26.513 19:04:44 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:26.513 19:04:44 -- scripts/common.sh@365 -- # decimal 2 00:06:26.513 19:04:44 -- scripts/common.sh@352 -- # local d=2 00:06:26.513 19:04:44 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:26.513 19:04:44 -- scripts/common.sh@354 -- # echo 2 00:06:26.513 19:04:44 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:26.513 19:04:44 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:26.513 19:04:44 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:26.513 19:04:44 -- scripts/common.sh@367 -- # return 0 00:06:26.513 19:04:44 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:26.513 19:04:44 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:26.513 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.513 --rc genhtml_branch_coverage=1 00:06:26.513 --rc genhtml_function_coverage=1 00:06:26.513 --rc genhtml_legend=1 00:06:26.513 --rc geninfo_all_blocks=1 00:06:26.513 --rc geninfo_unexecuted_blocks=1 00:06:26.513 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.513 ' 00:06:26.513 19:04:44 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:26.513 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.513 --rc genhtml_branch_coverage=1 00:06:26.513 --rc genhtml_function_coverage=1 00:06:26.513 --rc genhtml_legend=1 00:06:26.513 --rc geninfo_all_blocks=1 00:06:26.513 --rc geninfo_unexecuted_blocks=1 00:06:26.513 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.513 ' 00:06:26.513 19:04:44 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:26.513 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.514 --rc genhtml_branch_coverage=1 
00:06:26.514 --rc genhtml_function_coverage=1 00:06:26.514 --rc genhtml_legend=1 00:06:26.514 --rc geninfo_all_blocks=1 00:06:26.514 --rc geninfo_unexecuted_blocks=1 00:06:26.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.514 ' 00:06:26.514 19:04:44 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:26.514 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:26.514 --rc genhtml_branch_coverage=1 00:06:26.514 --rc genhtml_function_coverage=1 00:06:26.514 --rc genhtml_legend=1 00:06:26.514 --rc geninfo_all_blocks=1 00:06:26.514 --rc geninfo_unexecuted_blocks=1 00:06:26.514 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:26.514 ' 00:06:26.514 19:04:44 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.514 19:04:44 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:26.514 19:04:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.514 19:04:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.514 ************************************ 00:06:26.514 START TEST thread_poller_perf 00:06:26.514 ************************************ 00:06:26.514 19:04:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:26.514 [2024-11-18 19:04:44.988807] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:26.514 [2024-11-18 19:04:44.988923] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1286120 ] 00:06:26.514 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.514 [2024-11-18 19:04:45.060926] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.773 [2024-11-18 19:04:45.131920] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.773 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:27.710 [2024-11-18T18:04:46.314Z] ====================================== 00:06:27.710 [2024-11-18T18:04:46.314Z] busy:2506124630 (cyc) 00:06:27.710 [2024-11-18T18:04:46.314Z] total_run_count: 797000 00:06:27.710 [2024-11-18T18:04:46.314Z] tsc_hz: 2500000000 (cyc) 00:06:27.710 [2024-11-18T18:04:46.314Z] ====================================== 00:06:27.710 [2024-11-18T18:04:46.314Z] poller_cost: 3144 (cyc), 1257 (nsec) 00:06:27.710 00:06:27.710 real 0m1.228s 00:06:27.710 user 0m1.136s 00:06:27.710 sys 0m0.088s 00:06:27.710 19:04:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.710 19:04:46 -- common/autotest_common.sh@10 -- # set +x 00:06:27.710 ************************************ 00:06:27.710 END TEST thread_poller_perf 00:06:27.710 ************************************ 00:06:27.710 19:04:46 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:27.710 19:04:46 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:27.710 19:04:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.710 19:04:46 -- common/autotest_common.sh@10 -- # set +x 00:06:27.710 ************************************ 00:06:27.710 START TEST thread_poller_perf 00:06:27.710 ************************************ 00:06:27.710 19:04:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:27.710 [2024-11-18 19:04:46.267598] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:27.710 [2024-11-18 19:04:46.267694] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1286402 ] 00:06:27.710 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.970 [2024-11-18 19:04:46.339916] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.970 [2024-11-18 19:04:46.409275] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.970 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:28.907 [2024-11-18T18:04:47.511Z] ====================================== 00:06:28.907 [2024-11-18T18:04:47.511Z] busy:2501876286 (cyc) 00:06:28.907 [2024-11-18T18:04:47.511Z] total_run_count: 13443000 00:06:28.907 [2024-11-18T18:04:47.511Z] tsc_hz: 2500000000 (cyc) 00:06:28.907 [2024-11-18T18:04:47.511Z] ====================================== 00:06:28.907 [2024-11-18T18:04:47.511Z] poller_cost: 186 (cyc), 74 (nsec) 00:06:28.907 00:06:28.907 real 0m1.225s 00:06:28.907 user 0m1.133s 00:06:28.907 sys 0m0.088s 00:06:28.907 19:04:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:28.907 19:04:47 -- common/autotest_common.sh@10 -- # set +x 00:06:28.908 ************************************ 00:06:28.908 END TEST thread_poller_perf 00:06:28.908 ************************************ 00:06:29.167 19:04:47 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:29.167 19:04:47 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.167 19:04:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:29.167 19:04:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.167 19:04:47 -- common/autotest_common.sh@10 -- # set +x 00:06:29.167 ************************************ 00:06:29.167 START TEST thread_spdk_lock 00:06:29.167 ************************************ 00:06:29.167 19:04:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:29.167 [2024-11-18 19:04:47.527210] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:29.167 [2024-11-18 19:04:47.527271] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1286675 ] 00:06:29.167 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.167 [2024-11-18 19:04:47.589387] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:29.167 [2024-11-18 19:04:47.657943] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.167 [2024-11-18 19:04:47.657945] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.735 [2024-11-18 19:04:48.143851] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.735 [2024-11-18 19:04:48.143888] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:29.735 [2024-11-18 19:04:48.143898] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:06:29.735 [2024-11-18 19:04:48.144809] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.735 [2024-11-18 19:04:48.144913] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.735 [2024-11-18 19:04:48.144931] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:29.735 Starting test contend 00:06:29.735 Worker Delay Wait us Hold us Total us 00:06:29.735 0 3 168851 183939 352791 00:06:29.735 1 5 87566 284508 372075 00:06:29.735 PASS test contend 00:06:29.735 Starting test hold_by_poller 00:06:29.735 PASS test hold_by_poller 00:06:29.735 Starting test hold_by_message 00:06:29.735 PASS test hold_by_message 00:06:29.735 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:29.735 100014 assertions passed 00:06:29.735 0 assertions failed 00:06:29.736 00:06:29.736 real 0m0.687s 00:06:29.736 user 0m1.093s 00:06:29.736 sys 0m0.078s 00:06:29.736 19:04:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.736 19:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.736 ************************************ 00:06:29.736 END TEST thread_spdk_lock 00:06:29.736 ************************************ 00:06:29.736 00:06:29.736 real 0m3.453s 00:06:29.736 user 0m3.518s 00:06:29.736 sys 0m0.449s 00:06:29.736 19:04:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.736 19:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.736 ************************************ 00:06:29.736 END TEST thread 00:06:29.736 ************************************ 00:06:29.736 19:04:48 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:29.736 19:04:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:29.736 19:04:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.736 19:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.736 ************************************ 00:06:29.736 START TEST accel 00:06:29.736 ************************************ 00:06:29.736 19:04:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:29.995 * Looking for test storage... 00:06:29.995 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:29.995 19:04:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:29.995 19:04:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:29.995 19:04:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:29.995 19:04:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:29.995 19:04:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:29.995 19:04:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:29.995 19:04:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:29.995 19:04:48 -- scripts/common.sh@335 -- # IFS=.-: 00:06:29.995 19:04:48 -- scripts/common.sh@335 -- # read -ra ver1 00:06:29.995 19:04:48 -- scripts/common.sh@336 -- # IFS=.-: 00:06:29.995 19:04:48 -- scripts/common.sh@336 -- # read -ra ver2 00:06:29.995 19:04:48 -- scripts/common.sh@337 -- # local 'op=<' 00:06:29.995 19:04:48 -- scripts/common.sh@339 -- # ver1_l=2 00:06:29.995 19:04:48 -- scripts/common.sh@340 -- # ver2_l=1 00:06:29.995 19:04:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:29.995 19:04:48 -- scripts/common.sh@343 -- # case "$op" in 00:06:29.995 19:04:48 -- scripts/common.sh@344 -- # : 1 00:06:29.995 19:04:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:29.995 19:04:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:29.995 19:04:48 -- scripts/common.sh@364 -- # decimal 1 00:06:29.995 19:04:48 -- scripts/common.sh@352 -- # local d=1 00:06:29.995 19:04:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:29.995 19:04:48 -- scripts/common.sh@354 -- # echo 1 00:06:29.995 19:04:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:29.995 19:04:48 -- scripts/common.sh@365 -- # decimal 2 00:06:29.995 19:04:48 -- scripts/common.sh@352 -- # local d=2 00:06:29.995 19:04:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:29.995 19:04:48 -- scripts/common.sh@354 -- # echo 2 00:06:29.995 19:04:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:29.995 19:04:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:29.995 19:04:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:29.996 19:04:48 -- scripts/common.sh@367 -- # return 0 00:06:29.996 19:04:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:29.996 19:04:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:29.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:29.996 --rc genhtml_branch_coverage=1 00:06:29.996 --rc genhtml_function_coverage=1 00:06:29.996 --rc genhtml_legend=1 00:06:29.996 --rc geninfo_all_blocks=1 00:06:29.996 --rc geninfo_unexecuted_blocks=1 00:06:29.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:29.996 ' 00:06:29.996 19:04:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:29.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:29.996 --rc genhtml_branch_coverage=1 00:06:29.996 --rc genhtml_function_coverage=1 00:06:29.996 --rc genhtml_legend=1 00:06:29.996 --rc geninfo_all_blocks=1 00:06:29.996 --rc geninfo_unexecuted_blocks=1 00:06:29.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:29.996 ' 00:06:29.996 19:04:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:29.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:29.996 --rc genhtml_branch_coverage=1 00:06:29.996 --rc genhtml_function_coverage=1 00:06:29.996 --rc genhtml_legend=1 00:06:29.996 --rc geninfo_all_blocks=1 00:06:29.996 --rc geninfo_unexecuted_blocks=1 00:06:29.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:29.996 ' 00:06:29.996 19:04:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:29.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:29.996 --rc genhtml_branch_coverage=1 00:06:29.996 --rc genhtml_function_coverage=1 00:06:29.996 --rc genhtml_legend=1 00:06:29.996 --rc geninfo_all_blocks=1 00:06:29.996 --rc geninfo_unexecuted_blocks=1 00:06:29.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:29.996 ' 00:06:29.996 19:04:48 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:29.996 19:04:48 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:29.996 19:04:48 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:29.996 19:04:48 -- accel/accel.sh@59 -- # spdk_tgt_pid=1286774 00:06:29.996 19:04:48 -- accel/accel.sh@60 -- # waitforlisten 1286774 00:06:29.996 19:04:48 -- common/autotest_common.sh@829 -- # '[' -z 1286774 ']' 00:06:29.996 19:04:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.996 19:04:48 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:29.996 19:04:48 -- accel/accel.sh@58 -- # build_accel_config 00:06:29.996 19:04:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.996 19:04:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.996 19:04:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.996 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.996 19:04:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.996 19:04:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.996 19:04:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.996 19:04:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.996 19:04:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.996 19:04:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.996 19:04:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.996 19:04:48 -- accel/accel.sh@42 -- # jq -r . 00:06:29.996 [2024-11-18 19:04:48.494524] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:29.996 [2024-11-18 19:04:48.494598] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1286774 ] 00:06:29.996 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.996 [2024-11-18 19:04:48.560887] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.256 [2024-11-18 19:04:48.634094] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.256 [2024-11-18 19:04:48.634194] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.822 19:04:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.822 19:04:49 -- common/autotest_common.sh@862 -- # return 0 00:06:30.822 19:04:49 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:30.822 19:04:49 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:30.822 19:04:49 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:30.822 19:04:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.822 19:04:49 -- common/autotest_common.sh@10 -- # set +x 00:06:30.822 19:04:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 
19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # IFS== 00:06:30.822 19:04:49 -- accel/accel.sh@64 -- # read -r opc module 00:06:30.822 19:04:49 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:30.822 19:04:49 -- accel/accel.sh@67 -- # killprocess 1286774 00:06:30.822 19:04:49 -- common/autotest_common.sh@936 -- # '[' -z 1286774 ']' 00:06:30.822 19:04:49 -- common/autotest_common.sh@940 -- # kill -0 1286774 00:06:30.822 19:04:49 -- common/autotest_common.sh@941 -- # uname 00:06:30.822 19:04:49 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:30.823 19:04:49 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1286774 00:06:30.823 19:04:49 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:30.823 19:04:49 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:30.823 19:04:49 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1286774' 00:06:30.823 killing process with pid 1286774 00:06:30.823 19:04:49 -- common/autotest_common.sh@955 -- # kill 1286774 00:06:30.823 19:04:49 -- common/autotest_common.sh@960 -- # wait 1286774 00:06:31.391 19:04:49 -- accel/accel.sh@68 -- # trap - ERR 00:06:31.391 19:04:49 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:31.391 19:04:49 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:31.391 19:04:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.391 19:04:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.391 19:04:49 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:31.391 19:04:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:31.391 19:04:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.391 19:04:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.391 19:04:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.391 19:04:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.391 19:04:49 -- accel/accel.sh@42 -- # jq -r . 
00:06:31.391 19:04:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.391 19:04:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.391 19:04:49 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:31.391 19:04:49 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:31.391 19:04:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.391 19:04:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.391 ************************************ 00:06:31.391 START TEST accel_missing_filename 00:06:31.391 ************************************ 00:06:31.391 19:04:49 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:31.391 19:04:49 -- common/autotest_common.sh@650 -- # local es=0 00:06:31.391 19:04:49 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:31.391 19:04:49 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:31.391 19:04:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.391 19:04:49 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:31.391 19:04:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.391 19:04:49 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:31.391 19:04:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:31.391 19:04:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.391 19:04:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.391 19:04:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.391 19:04:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.391 19:04:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.391 19:04:49 -- accel/accel.sh@42 -- # jq -r . 00:06:31.392 [2024-11-18 19:04:49.798336] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:31.392 [2024-11-18 19:04:49.798447] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1287078 ] 00:06:31.392 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.392 [2024-11-18 19:04:49.868915] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.392 [2024-11-18 19:04:49.937153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.392 [2024-11-18 19:04:49.976595] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.651 [2024-11-18 19:04:50.037197] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:31.651 A filename is required. 
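[annotation] accel_missing_filename passes precisely because this run fails: compress with no -l option has no input file, so accel_perf aborts with "A filename is required." and a non-zero status, which the NOT wrapper inverts into success (the es=234 handling in the trace that follows squashes the raw status down to 1). A standalone reproduction of the negative check, with the binary path copied from this workspace:

    # Hedged sketch of the negative test exercised here: compress with
    # no -l input must fail. Path taken from this log's workspace; run
    # only in a comparable checkout.
    perf=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    if "$perf" -t 1 -w compress; then
        echo "unexpected success: compress without -l should fail" >&2
        exit 1
    fi
    echo "failed as expected"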
00:06:31.651 19:04:50 -- common/autotest_common.sh@653 -- # es=234 00:06:31.651 19:04:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:31.651 19:04:50 -- common/autotest_common.sh@662 -- # es=106 00:06:31.651 19:04:50 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:31.651 19:04:50 -- common/autotest_common.sh@670 -- # es=1 00:06:31.651 19:04:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:31.651 00:06:31.651 real 0m0.330s 00:06:31.651 user 0m0.233s 00:06:31.651 sys 0m0.135s 00:06:31.651 19:04:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.651 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:31.651 ************************************ 00:06:31.651 END TEST accel_missing_filename 00:06:31.651 ************************************ 00:06:31.651 19:04:50 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.651 19:04:50 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:31.651 19:04:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.651 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:31.651 ************************************ 00:06:31.651 START TEST accel_compress_verify 00:06:31.651 ************************************ 00:06:31.651 19:04:50 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.651 19:04:50 -- common/autotest_common.sh@650 -- # local es=0 00:06:31.651 19:04:50 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.651 19:04:50 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:31.651 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.651 19:04:50 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:31.651 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:31.651 19:04:50 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.651 19:04:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:31.651 19:04:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.651 19:04:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.651 19:04:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.651 19:04:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.651 19:04:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.651 19:04:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.651 19:04:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.651 19:04:50 -- accel/accel.sh@42 -- # jq -r . 00:06:31.651 [2024-11-18 19:04:50.168483] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:31.651 [2024-11-18 19:04:50.168576] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1287119 ] 00:06:31.651 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.651 [2024-11-18 19:04:50.239271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.911 [2024-11-18 19:04:50.309954] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.911 [2024-11-18 19:04:50.349926] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.911 [2024-11-18 19:04:50.410401] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:31.911 00:06:31.911 Compression does not support the verify option, aborting. 00:06:31.911 19:04:50 -- common/autotest_common.sh@653 -- # es=161 00:06:31.911 19:04:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:31.911 19:04:50 -- common/autotest_common.sh@662 -- # es=33 00:06:31.911 19:04:50 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:31.911 19:04:50 -- common/autotest_common.sh@670 -- # es=1 00:06:31.911 19:04:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:31.911 00:06:31.911 real 0m0.330s 00:06:31.911 user 0m0.235s 00:06:31.911 sys 0m0.132s 00:06:31.911 19:04:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.911 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:31.911 ************************************ 00:06:31.911 END TEST accel_compress_verify 00:06:31.911 ************************************ 00:06:32.170 19:04:50 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:32.170 19:04:50 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:32.170 19:04:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.170 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.170 ************************************ 00:06:32.170 START TEST accel_wrong_workload 00:06:32.170 ************************************ 00:06:32.170 19:04:50 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:32.170 19:04:50 -- common/autotest_common.sh@650 -- # local es=0 00:06:32.170 19:04:50 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:32.170 19:04:50 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:32.170 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.170 19:04:50 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:32.170 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.170 19:04:50 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:32.170 19:04:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:32.170 19:04:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.170 19:04:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.170 19:04:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.170 19:04:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.170 19:04:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.170 19:04:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.170 19:04:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.170 19:04:50 -- accel/accel.sh@42 -- # jq -r . 
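Each accel_perf launch above is handed its JSON accel configuration on /dev/fd/62; the build_accel_config trace (accel_json_cfg=(), IFS=,, jq -r .) assembles that document. A rough sketch of the plumbing, under the assumption that the fragments are comma-joined into one JSON object and opened on fd 62 (the real helper in accel.sh may differ in detail; a valid crc32c invocation is used for illustration):

    accel_json_cfg=()                      # empty in these runs: no modules forced
    IFS=,                                  # would comma-join multiple fragments
    json="{\"accel\":[${accel_json_cfg[*]}]}"
    exec 62< <(jq -r . <<< "$json")        # pretty-print and expose on fd 62
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y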
00:06:32.170 Unsupported workload type: foobar 00:06:32.170 [2024-11-18 19:04:50.541200] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:32.170 accel_perf options: 00:06:32.170 [-h help message] 00:06:32.170 [-q queue depth per core] 00:06:32.170 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:32.171 [-T number of threads per core 00:06:32.171 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:32.171 [-t time in seconds] 00:06:32.171 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:32.171 [ dif_verify, , dif_generate, dif_generate_copy 00:06:32.171 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:32.171 [-l for compress/decompress workloads, name of uncompressed input file 00:06:32.171 [-S for crc32c workload, use this seed value (default 0) 00:06:32.171 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:32.171 [-f for fill workload, use this BYTE value (default 255) 00:06:32.171 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:32.171 [-y verify result if this switch is on] 00:06:32.171 [-a tasks to allocate per core (default: same value as -q)] 00:06:32.171 Can be used to spread operations across a wider range of memory. 00:06:32.171 19:04:50 -- common/autotest_common.sh@653 -- # es=1 00:06:32.171 19:04:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:32.171 19:04:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:32.171 19:04:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:32.171 00:06:32.171 real 0m0.025s 00:06:32.171 user 0m0.009s 00:06:32.171 sys 0m0.016s 00:06:32.171 19:04:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.171 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.171 ************************************ 00:06:32.171 END TEST accel_wrong_workload 00:06:32.171 ************************************ 00:06:32.171 Error: writing output failed: Broken pipe 00:06:32.171 19:04:50 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:32.171 19:04:50 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:32.171 19:04:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.171 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.171 ************************************ 00:06:32.171 START TEST accel_negative_buffers 00:06:32.171 ************************************ 00:06:32.171 19:04:50 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:32.171 19:04:50 -- common/autotest_common.sh@650 -- # local es=0 00:06:32.171 19:04:50 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:32.171 19:04:50 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:32.171 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.171 19:04:50 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:32.171 19:04:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.171 19:04:50 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:32.171 19:04:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:32.171 19:04:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.171 19:04:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.171 19:04:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.171 19:04:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.171 19:04:50 -- accel/accel.sh@42 -- # jq -r . 00:06:32.171 -x option must be non-negative. 00:06:32.171 [2024-11-18 19:04:50.618150] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:32.171 accel_perf options: 00:06:32.171 [-h help message] 00:06:32.171 [-q queue depth per core] 00:06:32.171 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:32.171 [-T number of threads per core 00:06:32.171 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:32.171 [-t time in seconds] 00:06:32.171 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:32.171 [ dif_verify, , dif_generate, dif_generate_copy 00:06:32.171 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:32.171 [-l for compress/decompress workloads, name of uncompressed input file 00:06:32.171 [-S for crc32c workload, use this seed value (default 0) 00:06:32.171 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:32.171 [-f for fill workload, use this BYTE value (default 255) 00:06:32.171 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:32.171 [-y verify result if this switch is on] 00:06:32.171 [-a tasks to allocate per core (default: same value as -q)] 00:06:32.171 Can be used to spread operations across a wider range of memory. 
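The es= traces that follow each failed run implement the exit-status bookkeeping behind NOT: statuses above 128 are folded back down (234 became 106 for the missing-filename case earlier, likely a negative errno wrapped to 8 bits), then collapsed to a generic failure before the final inverted check. A simplified reading of those traces, not a verbatim copy of autotest_common.sh:

    es=$?                      # e.g. 234 from the failed accel_perf run earlier
    if ((es > 128)); then
        es=$((es - 128))       # assumed offset strip: 234 -> 106
    fi
    case "$es" in
        *) es=1 ;;             # any remaining failure becomes a plain status 1
    esac
    ((!es == 0))               # NOT passes only when the command had failed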
00:06:32.171 19:04:50 -- common/autotest_common.sh@653 -- # es=1 00:06:32.171 19:04:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:32.171 19:04:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:32.171 19:04:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:32.171 00:06:32.171 real 0m0.030s 00:06:32.171 user 0m0.014s 00:06:32.171 sys 0m0.016s 00:06:32.171 19:04:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.171 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.171 ************************************ 00:06:32.171 END TEST accel_negative_buffers 00:06:32.171 ************************************ 00:06:32.171 Error: writing output failed: Broken pipe 00:06:32.171 19:04:50 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:32.171 19:04:50 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:32.171 19:04:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.171 19:04:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.171 ************************************ 00:06:32.171 START TEST accel_crc32c 00:06:32.171 ************************************ 00:06:32.171 19:04:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:32.171 19:04:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:32.171 19:04:50 -- accel/accel.sh@17 -- # local accel_module 00:06:32.171 19:04:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:32.171 19:04:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:32.171 19:04:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:32.171 19:04:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.171 19:04:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.171 19:04:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.171 19:04:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.171 19:04:50 -- accel/accel.sh@42 -- # jq -r . 00:06:32.171 [2024-11-18 19:04:50.688625] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:32.171 [2024-11-18 19:04:50.688713] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1287409 ] 00:06:32.171 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.171 [2024-11-18 19:04:50.759750] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.431 [2024-11-18 19:04:50.834413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.509 19:04:52 -- accel/accel.sh@18 -- # out=' 00:06:33.509 SPDK Configuration: 00:06:33.509 Core mask: 0x1 00:06:33.509 00:06:33.509 Accel Perf Configuration: 00:06:33.509 Workload Type: crc32c 00:06:33.509 CRC-32C seed: 32 00:06:33.509 Transfer size: 4096 bytes 00:06:33.509 Vector count 1 00:06:33.509 Module: software 00:06:33.509 Queue depth: 32 00:06:33.509 Allocate depth: 32 00:06:33.509 # threads/core: 1 00:06:33.509 Run time: 1 seconds 00:06:33.509 Verify: Yes 00:06:33.509 00:06:33.509 Running for 1 seconds... 
00:06:33.509 00:06:33.509 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:33.509 ------------------------------------------------------------------------------------ 00:06:33.509 0,0 838560/s 3275 MiB/s 0 0 00:06:33.509 ==================================================================================== 00:06:33.509 Total 838560/s 3275 MiB/s 0 0' 00:06:33.509 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.509 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.509 19:04:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:33.509 19:04:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:33.509 19:04:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.509 19:04:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.509 19:04:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.509 19:04:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.509 19:04:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.509 19:04:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.509 19:04:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.509 19:04:52 -- accel/accel.sh@42 -- # jq -r . 00:06:33.509 [2024-11-18 19:04:52.024317] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:33.509 [2024-11-18 19:04:52.024412] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1287601 ] 00:06:33.509 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.779 [2024-11-18 19:04:52.093614] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.779 [2024-11-18 19:04:52.168241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=0x1 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=crc32c 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=32 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 
19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=software 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@23 -- # accel_module=software 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=32 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=32 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val=1 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.779 19:04:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:33.779 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.779 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.780 19:04:52 -- accel/accel.sh@21 -- # val=Yes 00:06:33.780 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.780 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.780 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:33.780 19:04:52 -- accel/accel.sh@21 -- # val= 00:06:33.780 19:04:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # IFS=: 00:06:33.780 19:04:52 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 
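The long runs of IFS=:, read -r var val, and case "$var" in above come from a loop that replays the second accel_perf invocation's settings as var:val pairs and records the interesting ones (accel_opc, accel_module). A compressed sketch of that shape, offered as an assumption about accel.sh's internals rather than a verbatim excerpt:

    while IFS=: read -r var val; do
        case "$var" in
            opc)    accel_opc=$val ;;       # e.g. crc32c in the traces above
            module) accel_module=$val ;;    # e.g. software
            *)      : ;;                    # everything else is stepped past
        esac
    done <<< $'opc:crc32c\nmodule:software' # stand-in input for the sketch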
00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@21 -- # val= 00:06:35.159 19:04:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # IFS=: 00:06:35.159 19:04:53 -- accel/accel.sh@20 -- # read -r var val 00:06:35.159 19:04:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:35.159 19:04:53 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:35.159 19:04:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.159 00:06:35.159 real 0m2.672s 00:06:35.159 user 0m2.417s 00:06:35.159 sys 0m0.255s 00:06:35.159 19:04:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.160 19:04:53 -- common/autotest_common.sh@10 -- # set +x 00:06:35.160 ************************************ 00:06:35.160 END TEST accel_crc32c 00:06:35.160 ************************************ 00:06:35.160 19:04:53 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:35.160 19:04:53 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:35.160 19:04:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.160 19:04:53 -- common/autotest_common.sh@10 -- # set +x 00:06:35.160 ************************************ 00:06:35.160 START TEST accel_crc32c_C2 00:06:35.160 ************************************ 00:06:35.160 19:04:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:35.160 19:04:53 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.160 19:04:53 -- accel/accel.sh@17 -- # local accel_module 00:06:35.160 19:04:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:35.160 19:04:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:35.160 19:04:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.160 19:04:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.160 19:04:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.160 19:04:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.160 19:04:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.160 19:04:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.160 19:04:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.160 19:04:53 -- accel/accel.sh@42 -- # jq -r . 00:06:35.160 [2024-11-18 19:04:53.406050] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:35.160 [2024-11-18 19:04:53.406153] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1287813 ] 00:06:35.160 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.160 [2024-11-18 19:04:53.477544] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.160 [2024-11-18 19:04:53.551268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.539 19:04:54 -- accel/accel.sh@18 -- # out=' 00:06:36.539 SPDK Configuration: 00:06:36.539 Core mask: 0x1 00:06:36.539 00:06:36.539 Accel Perf Configuration: 00:06:36.539 Workload Type: crc32c 00:06:36.539 CRC-32C seed: 0 00:06:36.539 Transfer size: 4096 bytes 00:06:36.539 Vector count 2 00:06:36.539 Module: software 00:06:36.539 Queue depth: 32 00:06:36.539 Allocate depth: 32 00:06:36.539 # threads/core: 1 00:06:36.539 Run time: 1 seconds 00:06:36.539 Verify: Yes 00:06:36.539 00:06:36.539 Running for 1 seconds... 00:06:36.539 00:06:36.539 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:36.539 ------------------------------------------------------------------------------------ 00:06:36.539 0,0 619616/s 2420 MiB/s 0 0 00:06:36.539 ==================================================================================== 00:06:36.539 Total 619616/s 2420 MiB/s 0 0' 00:06:36.539 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.539 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.539 19:04:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:36.539 19:04:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:36.539 19:04:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.539 19:04:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.539 19:04:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.539 19:04:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.539 19:04:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.539 19:04:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.539 19:04:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.539 19:04:54 -- accel/accel.sh@42 -- # jq -r . 00:06:36.539 [2024-11-18 19:04:54.740999] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:06:36.540 [2024-11-18 19:04:54.741087] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1288001 ] 00:06:36.540 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.540 [2024-11-18 19:04:54.812051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.540 [2024-11-18 19:04:54.879361] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=0x1 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=crc32c 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=0 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=software 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@23 -- # accel_module=software 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=32 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=32 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- 
accel/accel.sh@21 -- # val=1 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val=Yes 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:36.540 19:04:54 -- accel/accel.sh@21 -- # val= 00:06:36.540 19:04:54 -- accel/accel.sh@22 -- # case "$var" in 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # IFS=: 00:06:36.540 19:04:54 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@21 -- # val= 00:06:37.480 19:04:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # IFS=: 00:06:37.480 19:04:56 -- accel/accel.sh@20 -- # read -r var val 00:06:37.480 19:04:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:37.480 19:04:56 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:37.480 19:04:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:37.480 00:06:37.480 real 0m2.666s 00:06:37.480 user 0m2.410s 00:06:37.480 sys 0m0.255s 00:06:37.480 19:04:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:37.480 19:04:56 -- common/autotest_common.sh@10 -- # set +x 00:06:37.480 ************************************ 00:06:37.480 END TEST accel_crc32c_C2 00:06:37.480 ************************************ 00:06:37.740 19:04:56 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:37.740 19:04:56 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:37.740 19:04:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.740 19:04:56 -- common/autotest_common.sh@10 -- # set +x 00:06:37.740 ************************************ 00:06:37.740 START TEST accel_copy 
00:06:37.740 ************************************ 00:06:37.740 19:04:56 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:37.740 19:04:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:37.740 19:04:56 -- accel/accel.sh@17 -- # local accel_module 00:06:37.740 19:04:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:37.740 19:04:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:37.740 19:04:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.740 19:04:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.740 19:04:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.740 19:04:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.740 19:04:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.740 19:04:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.740 19:04:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.740 19:04:56 -- accel/accel.sh@42 -- # jq -r . 00:06:37.740 [2024-11-18 19:04:56.112244] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:37.740 [2024-11-18 19:04:56.112315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1288275 ] 00:06:37.740 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.740 [2024-11-18 19:04:56.179933] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.740 [2024-11-18 19:04:56.248012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.119 19:04:57 -- accel/accel.sh@18 -- # out=' 00:06:39.119 SPDK Configuration: 00:06:39.119 Core mask: 0x1 00:06:39.119 00:06:39.119 Accel Perf Configuration: 00:06:39.119 Workload Type: copy 00:06:39.120 Transfer size: 4096 bytes 00:06:39.120 Vector count 1 00:06:39.120 Module: software 00:06:39.120 Queue depth: 32 00:06:39.120 Allocate depth: 32 00:06:39.120 # threads/core: 1 00:06:39.120 Run time: 1 seconds 00:06:39.120 Verify: Yes 00:06:39.120 00:06:39.120 Running for 1 seconds... 00:06:39.120 00:06:39.120 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:39.120 ------------------------------------------------------------------------------------ 00:06:39.120 0,0 548704/s 2143 MiB/s 0 0 00:06:39.120 ==================================================================================== 00:06:39.120 Total 548704/s 2143 MiB/s 0 0' 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:39.120 19:04:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:39.120 19:04:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.120 19:04:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.120 19:04:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.120 19:04:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.120 19:04:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.120 19:04:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.120 19:04:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.120 19:04:57 -- accel/accel.sh@42 -- # jq -r . 00:06:39.120 [2024-11-18 19:04:57.439675] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:39.120 [2024-11-18 19:04:57.439773] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1288541 ] 00:06:39.120 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.120 [2024-11-18 19:04:57.511853] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.120 [2024-11-18 19:04:57.576094] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=0x1 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=copy 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=software 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@23 -- # accel_module=software 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=32 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=32 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=1 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val=Yes 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.120 19:04:57 -- accel/accel.sh@21 -- # val= 00:06:39.120 19:04:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.120 19:04:57 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@21 -- # val= 00:06:40.498 19:04:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # IFS=: 00:06:40.498 19:04:58 -- accel/accel.sh@20 -- # read -r var val 00:06:40.498 19:04:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:40.498 19:04:58 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:40.498 19:04:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.498 00:06:40.498 real 0m2.654s 00:06:40.498 user 0m2.396s 00:06:40.498 sys 0m0.256s 00:06:40.498 19:04:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:40.498 19:04:58 -- common/autotest_common.sh@10 -- # set +x 00:06:40.498 ************************************ 00:06:40.498 END TEST accel_copy 00:06:40.498 ************************************ 00:06:40.498 19:04:58 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.498 19:04:58 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:40.498 19:04:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:40.498 19:04:58 -- common/autotest_common.sh@10 -- # set +x 00:06:40.498 ************************************ 00:06:40.498 START TEST accel_fill 00:06:40.498 ************************************ 00:06:40.498 19:04:58 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.498 19:04:58 -- accel/accel.sh@16 -- # local accel_opc 
00:06:40.498 19:04:58 -- accel/accel.sh@17 -- # local accel_module 00:06:40.498 19:04:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.498 19:04:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:40.498 19:04:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.498 19:04:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.498 19:04:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.498 19:04:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.498 19:04:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.498 19:04:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.498 19:04:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.498 19:04:58 -- accel/accel.sh@42 -- # jq -r . 00:06:40.498 [2024-11-18 19:04:58.812818] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:40.498 [2024-11-18 19:04:58.812905] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1288830 ] 00:06:40.498 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.498 [2024-11-18 19:04:58.883941] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.498 [2024-11-18 19:04:58.951859] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.885 19:05:00 -- accel/accel.sh@18 -- # out=' 00:06:41.885 SPDK Configuration: 00:06:41.885 Core mask: 0x1 00:06:41.885 00:06:41.885 Accel Perf Configuration: 00:06:41.885 Workload Type: fill 00:06:41.885 Fill pattern: 0x80 00:06:41.885 Transfer size: 4096 bytes 00:06:41.885 Vector count 1 00:06:41.885 Module: software 00:06:41.885 Queue depth: 64 00:06:41.885 Allocate depth: 64 00:06:41.885 # threads/core: 1 00:06:41.885 Run time: 1 seconds 00:06:41.885 Verify: Yes 00:06:41.885 00:06:41.885 Running for 1 seconds... 00:06:41.885 00:06:41.885 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.885 ------------------------------------------------------------------------------------ 00:06:41.885 0,0 965760/s 3772 MiB/s 0 0 00:06:41.885 ==================================================================================== 00:06:41.885 Total 965760/s 3772 MiB/s 0 0' 00:06:41.885 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.885 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.885 19:05:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.885 19:05:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:41.885 19:05:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.885 19:05:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.885 19:05:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.885 19:05:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.885 19:05:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.885 19:05:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.885 19:05:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.886 19:05:00 -- accel/accel.sh@42 -- # jq -r . 00:06:41.886 [2024-11-18 19:05:00.136010] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:41.886 [2024-11-18 19:05:00.136081] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1289105 ] 00:06:41.886 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.886 [2024-11-18 19:05:00.204296] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.886 [2024-11-18 19:05:00.278596] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=0x1 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=fill 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=0x80 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=software 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=64 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=64 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- 
accel/accel.sh@21 -- # val=1 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val=Yes 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:41.886 19:05:00 -- accel/accel.sh@21 -- # val= 00:06:41.886 19:05:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # IFS=: 00:06:41.886 19:05:00 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@21 -- # val= 00:06:43.266 19:05:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.266 19:05:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.266 19:05:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.266 19:05:01 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:43.266 19:05:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.266 00:06:43.266 real 0m2.660s 00:06:43.266 user 0m2.402s 00:06:43.266 sys 0m0.256s 00:06:43.266 19:05:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:43.266 19:05:01 -- common/autotest_common.sh@10 -- # set +x 00:06:43.266 ************************************ 00:06:43.266 END TEST accel_fill 00:06:43.266 ************************************ 00:06:43.266 19:05:01 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:43.266 19:05:01 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:43.266 19:05:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.266 19:05:01 -- common/autotest_common.sh@10 -- # set +x 00:06:43.266 ************************************ 00:06:43.266 START TEST 
accel_copy_crc32c 00:06:43.266 ************************************ 00:06:43.266 19:05:01 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:43.266 19:05:01 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.266 19:05:01 -- accel/accel.sh@17 -- # local accel_module 00:06:43.266 19:05:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:43.266 19:05:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:43.266 19:05:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.266 19:05:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.266 19:05:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.266 19:05:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.266 19:05:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.266 19:05:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.266 19:05:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.266 19:05:01 -- accel/accel.sh@42 -- # jq -r . 00:06:43.266 [2024-11-18 19:05:01.520659] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:43.266 [2024-11-18 19:05:01.520737] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1289388 ] 00:06:43.266 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.266 [2024-11-18 19:05:01.589867] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.266 [2024-11-18 19:05:01.659576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.646 19:05:02 -- accel/accel.sh@18 -- # out=' 00:06:44.646 SPDK Configuration: 00:06:44.646 Core mask: 0x1 00:06:44.646 00:06:44.646 Accel Perf Configuration: 00:06:44.646 Workload Type: copy_crc32c 00:06:44.646 CRC-32C seed: 0 00:06:44.646 Vector size: 4096 bytes 00:06:44.646 Transfer size: 4096 bytes 00:06:44.646 Vector count 1 00:06:44.646 Module: software 00:06:44.646 Queue depth: 32 00:06:44.646 Allocate depth: 32 00:06:44.646 # threads/core: 1 00:06:44.646 Run time: 1 seconds 00:06:44.646 Verify: Yes 00:06:44.646 00:06:44.646 Running for 1 seconds... 00:06:44.646 00:06:44.646 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:44.646 ------------------------------------------------------------------------------------ 00:06:44.646 0,0 428096/s 1672 MiB/s 0 0 00:06:44.646 ==================================================================================== 00:06:44.646 Total 428096/s 1672 MiB/s 0 0' 00:06:44.647 19:05:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:44.647 19:05:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:44.647 19:05:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.647 19:05:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.647 19:05:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.647 19:05:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.647 19:05:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.647 19:05:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.647 19:05:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.647 19:05:02 -- accel/accel.sh@42 -- # jq -r . 
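The Bandwidth column in these tables is just transfers/s times the transfer size; a quick check against the copy_crc32c numbers above:

    bw_mib() { echo $(( $1 * $2 / 1024 / 1024 )); }   # args: transfers/s, bytes
    bw_mib 428096 4096   # -> 1672 MiB/s, matching the copy_crc32c row
    bw_mib 838560 4096   # -> 3275 MiB/s, the earlier crc32c run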
00:06:44.647 [2024-11-18 19:05:02.851309] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:44.647 [2024-11-18 19:05:02.851382] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1289616 ] 00:06:44.647 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.647 [2024-11-18 19:05:02.919978] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.647 [2024-11-18 19:05:02.988484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=0x1 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=0 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=software 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=32 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 
00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=32 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=1 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val=Yes 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:44.647 19:05:03 -- accel/accel.sh@21 -- # val= 00:06:44.647 19:05:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # IFS=: 00:06:44.647 19:05:03 -- accel/accel.sh@20 -- # read -r var val 00:06:45.585 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.585 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.585 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.585 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.585 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.585 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.585 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.585 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.585 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.585 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.585 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.586 19:05:04 -- accel/accel.sh@21 -- # val= 00:06:45.586 19:05:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.586 19:05:04 -- accel/accel.sh@20 -- # IFS=: 00:06:45.586 19:05:04 -- accel/accel.sh@20 -- # read -r var val 00:06:45.586 19:05:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:45.586 19:05:04 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:45.586 19:05:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:45.586 00:06:45.586 real 0m2.659s 00:06:45.586 user 0m2.413s 00:06:45.586 sys 0m0.246s 00:06:45.586 19:05:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:45.586 19:05:04 -- common/autotest_common.sh@10 -- # set +x 00:06:45.586 ************************************ 00:06:45.586 END TEST accel_copy_crc32c 00:06:45.586 ************************************ 00:06:45.845 
19:05:04 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.845 19:05:04 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:45.845 19:05:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:45.845 19:05:04 -- common/autotest_common.sh@10 -- # set +x 00:06:45.845 ************************************ 00:06:45.845 START TEST accel_copy_crc32c_C2 00:06:45.845 ************************************ 00:06:45.845 19:05:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:45.845 19:05:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.845 19:05:04 -- accel/accel.sh@17 -- # local accel_module 00:06:45.845 19:05:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:45.845 19:05:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:45.845 19:05:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.845 19:05:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.845 19:05:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.845 19:05:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.845 19:05:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.845 19:05:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.845 19:05:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.845 19:05:04 -- accel/accel.sh@42 -- # jq -r . 00:06:45.845 [2024-11-18 19:05:04.223371] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:45.845 [2024-11-18 19:05:04.223460] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1289819 ] 00:06:45.845 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.845 [2024-11-18 19:05:04.292723] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.845 [2024-11-18 19:05:04.361903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.225 19:05:05 -- accel/accel.sh@18 -- # out=' 00:06:47.225 SPDK Configuration: 00:06:47.225 Core mask: 0x1 00:06:47.225 00:06:47.225 Accel Perf Configuration: 00:06:47.225 Workload Type: copy_crc32c 00:06:47.225 CRC-32C seed: 0 00:06:47.225 Vector size: 4096 bytes 00:06:47.225 Transfer size: 8192 bytes 00:06:47.225 Vector count 2 00:06:47.225 Module: software 00:06:47.225 Queue depth: 32 00:06:47.225 Allocate depth: 32 00:06:47.225 # threads/core: 1 00:06:47.225 Run time: 1 seconds 00:06:47.225 Verify: Yes 00:06:47.225 00:06:47.225 Running for 1 seconds... 
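(For context, the -C 2 variant keeps the 4096-byte vector size but chains two vectors per operation -- "Transfer size: 8192 bytes, Vector count 2" in the configuration above -- computing a single CRC-32C across both vectors while copying them. A minimal illustrative sketch of that operation in C; this is not SPDK's implementation, and the helper name is ours:

    #include <stdint.h>
    #include <string.h>
    #include <sys/uio.h>

    /* Copy each source vector into dst while accumulating one CRC-32C
     * (Castagnoli polynomial 0x1EDC6F41, reflected form 0x82F63B78)
     * across all vectors. Bitwise form chosen for clarity, not speed. */
    static uint32_t copy_crc32c(uint8_t *dst, const struct iovec *iov,
                                int iovcnt, uint32_t seed)
    {
        uint32_t crc = ~seed;
        for (int i = 0; i < iovcnt; i++) {
            const uint8_t *src = iov[i].iov_base;
            memcpy(dst, src, iov[i].iov_len);
            dst += iov[i].iov_len;
            for (size_t j = 0; j < iov[i].iov_len; j++) {
                crc ^= src[j];
                for (int k = 0; k < 8; k++)
                    crc = (crc >> 1) ^ (0x82F63B78 & -(crc & 1));
            }
        }
        return ~crc;
    }

On the bandwidth figures in the table that follows: the per-core line matches the full 8192-byte transfer, 299488 x 8192 / 2^20 = 2339 MiB/s, while the Total line appears to be computed from the 4096-byte vector size instead, 299488 x 4096 / 2^20 = 1169 MiB/s.)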
00:06:47.225 00:06:47.225 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:47.225 ------------------------------------------------------------------------------------ 00:06:47.225 0,0 299488/s 2339 MiB/s 0 0 00:06:47.225 ==================================================================================== 00:06:47.225 Total 299488/s 1169 MiB/s 0 0' 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:47.225 19:05:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:47.225 19:05:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.225 19:05:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.225 19:05:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.225 19:05:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.225 19:05:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.225 19:05:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.225 19:05:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.225 19:05:05 -- accel/accel.sh@42 -- # jq -r . 00:06:47.225 [2024-11-18 19:05:05.552910] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:47.225 [2024-11-18 19:05:05.552995] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1289981 ] 00:06:47.225 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.225 [2024-11-18 19:05:05.623200] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.225 [2024-11-18 19:05:05.690314] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=0x1 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=0 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # 
IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=software 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@23 -- # accel_module=software 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=32 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=32 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=1 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val=Yes 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.225 19:05:05 -- accel/accel.sh@21 -- # val= 00:06:47.225 19:05:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.225 19:05:05 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@21 -- # val= 00:06:48.605 19:05:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.605 19:05:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.605 19:05:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:48.605 19:05:06 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:48.605 19:05:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.605 00:06:48.605 real 0m2.659s 00:06:48.605 user 0m2.396s 00:06:48.605 sys 0m0.263s 00:06:48.605 19:05:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:48.605 19:05:06 -- common/autotest_common.sh@10 -- # set +x 00:06:48.605 ************************************ 00:06:48.605 END TEST accel_copy_crc32c_C2 00:06:48.605 ************************************ 00:06:48.605 19:05:06 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:48.605 19:05:06 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:48.605 19:05:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.605 19:05:06 -- common/autotest_common.sh@10 -- # set +x 00:06:48.605 ************************************ 00:06:48.605 START TEST accel_dualcast 00:06:48.605 ************************************ 00:06:48.605 19:05:06 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:48.605 19:05:06 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.605 19:05:06 -- accel/accel.sh@17 -- # local accel_module 00:06:48.605 19:05:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:48.605 19:05:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:48.605 19:05:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.605 19:05:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.605 19:05:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.605 19:05:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.605 19:05:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.605 19:05:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.605 19:05:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.605 19:05:06 -- accel/accel.sh@42 -- # jq -r . 00:06:48.605 [2024-11-18 19:05:06.927105] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
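(For context on the dualcast run under way here: dualcast writes one source buffer to two destination buffers in a single operation. A minimal sketch of the semantics, assuming plain memcpy behaviour -- illustrative only, not SPDK's code:

    #include <string.h>

    /* Write the same len bytes of src to both destinations. */
    static void dualcast(void *dst1, void *dst2, const void *src, size_t len)
    {
        memcpy(dst1, src, len);
        memcpy(dst2, src, len);
    }

The 2457 MiB/s reported below matches 629024 x 4096 / 2^20, suggesting bandwidth is counted once per operation rather than once per destination.)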
00:06:48.605 [2024-11-18 19:05:06.927193] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290252 ] 00:06:48.605 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.605 [2024-11-18 19:05:06.999450] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.605 [2024-11-18 19:05:07.067663] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.984 19:05:08 -- accel/accel.sh@18 -- # out=' 00:06:49.984 SPDK Configuration: 00:06:49.984 Core mask: 0x1 00:06:49.984 00:06:49.984 Accel Perf Configuration: 00:06:49.984 Workload Type: dualcast 00:06:49.984 Transfer size: 4096 bytes 00:06:49.984 Vector count 1 00:06:49.984 Module: software 00:06:49.984 Queue depth: 32 00:06:49.984 Allocate depth: 32 00:06:49.984 # threads/core: 1 00:06:49.984 Run time: 1 seconds 00:06:49.984 Verify: Yes 00:06:49.984 00:06:49.984 Running for 1 seconds... 00:06:49.984 00:06:49.984 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:49.984 ------------------------------------------------------------------------------------ 00:06:49.984 0,0 629024/s 2457 MiB/s 0 0 00:06:49.984 ==================================================================================== 00:06:49.984 Total 629024/s 2457 MiB/s 0 0' 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:49.984 19:05:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:49.984 19:05:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.984 19:05:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.984 19:05:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.984 19:05:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.984 19:05:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.984 19:05:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.984 19:05:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.984 19:05:08 -- accel/accel.sh@42 -- # jq -r . 00:06:49.984 [2024-11-18 19:05:08.259397] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:49.984 [2024-11-18 19:05:08.259483] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290523 ] 00:06:49.984 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.984 [2024-11-18 19:05:08.329963] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.984 [2024-11-18 19:05:08.396120] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val=0x1 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val=dualcast 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val=software 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.984 19:05:08 -- accel/accel.sh@21 -- # val=32 00:06:49.984 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.984 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 -- accel/accel.sh@21 -- # val=32 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 -- accel/accel.sh@21 -- # val=1 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 -- accel/accel.sh@21 -- # val=Yes 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:49.985 19:05:08 -- accel/accel.sh@21 -- # val= 00:06:49.985 19:05:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # IFS=: 00:06:49.985 19:05:08 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@21 -- # val= 00:06:51.364 19:05:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.364 19:05:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.364 19:05:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.364 19:05:09 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:51.364 19:05:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.364 00:06:51.364 real 0m2.661s 00:06:51.364 user 0m2.387s 00:06:51.364 sys 0m0.272s 00:06:51.364 19:05:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.364 19:05:09 -- common/autotest_common.sh@10 -- # set +x 00:06:51.364 ************************************ 00:06:51.364 END TEST accel_dualcast 00:06:51.364 ************************************ 00:06:51.364 19:05:09 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:51.364 19:05:09 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:51.364 19:05:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.364 19:05:09 -- common/autotest_common.sh@10 -- # set +x 00:06:51.364 ************************************ 00:06:51.364 START TEST accel_compare 00:06:51.364 ************************************ 00:06:51.364 19:05:09 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:51.364 19:05:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.364 19:05:09 
-- accel/accel.sh@17 -- # local accel_module 00:06:51.364 19:05:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:51.364 19:05:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:51.364 19:05:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.364 19:05:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.364 19:05:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.364 19:05:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.364 19:05:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.364 19:05:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.364 19:05:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.364 19:05:09 -- accel/accel.sh@42 -- # jq -r . 00:06:51.364 [2024-11-18 19:05:09.631687] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:51.364 [2024-11-18 19:05:09.631760] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1290804 ] 00:06:51.364 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.364 [2024-11-18 19:05:09.700336] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.364 [2024-11-18 19:05:09.767753] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.743 19:05:10 -- accel/accel.sh@18 -- # out=' 00:06:52.743 SPDK Configuration: 00:06:52.743 Core mask: 0x1 00:06:52.743 00:06:52.743 Accel Perf Configuration: 00:06:52.743 Workload Type: compare 00:06:52.743 Transfer size: 4096 bytes 00:06:52.743 Vector count 1 00:06:52.743 Module: software 00:06:52.743 Queue depth: 32 00:06:52.743 Allocate depth: 32 00:06:52.743 # threads/core: 1 00:06:52.743 Run time: 1 seconds 00:06:52.743 Verify: Yes 00:06:52.743 00:06:52.743 Running for 1 seconds... 00:06:52.743 00:06:52.743 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:52.743 ------------------------------------------------------------------------------------ 00:06:52.743 0,0 784704/s 3065 MiB/s 0 0 00:06:52.743 ==================================================================================== 00:06:52.743 Total 784704/s 3065 MiB/s 0 0' 00:06:52.743 19:05:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:52.744 19:05:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:52.744 19:05:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.744 19:05:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.744 19:05:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.744 19:05:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.744 19:05:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.744 19:05:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.744 19:05:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.744 19:05:10 -- accel/accel.sh@42 -- # jq -r . 00:06:52.744 [2024-11-18 19:05:10.959941] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
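(The compare results above check two equal-length buffers for equality, with the Miscompares column -- as we read it -- counting operations whose buffers differed; all zero here. The rate again works out as 784704 x 4096 / 2^20 = 3065 MiB/s. A minimal sketch of the per-operation semantics; illustrative, name is ours:

    #include <string.h>

    /* Returns nonzero when the buffers differ, i.e. a miscompare. */
    static int compare_op(const void *a, const void *b, size_t len)
    {
        return memcmp(a, b, len) != 0;
    }
)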
00:06:52.744 [2024-11-18 19:05:10.960025] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291072 ] 00:06:52.744 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.744 [2024-11-18 19:05:11.031751] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.744 [2024-11-18 19:05:11.097751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=0x1 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=compare 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=software 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=32 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=32 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=1 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val=Yes 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:52.744 19:05:11 -- accel/accel.sh@21 -- # val= 00:06:52.744 19:05:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # IFS=: 00:06:52.744 19:05:11 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@21 -- # val= 00:06:53.681 19:05:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.681 19:05:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.681 19:05:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:53.681 19:05:12 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:53.681 19:05:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.681 00:06:53.681 real 0m2.659s 00:06:53.681 user 0m2.409s 00:06:53.681 sys 0m0.248s 00:06:53.681 19:05:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:53.681 19:05:12 -- common/autotest_common.sh@10 -- # set +x 00:06:53.681 ************************************ 00:06:53.681 END TEST accel_compare 00:06:53.681 ************************************ 00:06:53.941 19:05:12 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:53.941 19:05:12 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:53.941 19:05:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.941 19:05:12 -- common/autotest_common.sh@10 -- # set +x 00:06:53.941 ************************************ 00:06:53.941 START TEST accel_xor 00:06:53.941 ************************************ 00:06:53.941 19:05:12 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:53.941 19:05:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.941 19:05:12 -- accel/accel.sh@17 
-- # local accel_module 00:06:53.941 19:05:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:53.941 19:05:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:53.941 19:05:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.941 19:05:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.941 19:05:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.941 19:05:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.941 19:05:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.941 19:05:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.941 19:05:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.941 19:05:12 -- accel/accel.sh@42 -- # jq -r . 00:06:53.941 [2024-11-18 19:05:12.334910] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:53.941 [2024-11-18 19:05:12.334995] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291359 ] 00:06:53.941 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.941 [2024-11-18 19:05:12.405588] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.941 [2024-11-18 19:05:12.473079] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.321 19:05:13 -- accel/accel.sh@18 -- # out=' 00:06:55.321 SPDK Configuration: 00:06:55.321 Core mask: 0x1 00:06:55.321 00:06:55.321 Accel Perf Configuration: 00:06:55.321 Workload Type: xor 00:06:55.321 Source buffers: 2 00:06:55.321 Transfer size: 4096 bytes 00:06:55.321 Vector count 1 00:06:55.321 Module: software 00:06:55.321 Queue depth: 32 00:06:55.321 Allocate depth: 32 00:06:55.321 # threads/core: 1 00:06:55.321 Run time: 1 seconds 00:06:55.321 Verify: Yes 00:06:55.321 00:06:55.321 Running for 1 seconds... 00:06:55.321 00:06:55.321 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.321 ------------------------------------------------------------------------------------ 00:06:55.321 0,0 703936/s 2749 MiB/s 0 0 00:06:55.321 ==================================================================================== 00:06:55.321 Total 703936/s 2749 MiB/s 0 0' 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:55.321 19:05:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:55.321 19:05:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.321 19:05:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.321 19:05:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.321 19:05:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.321 19:05:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.321 19:05:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.321 19:05:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.321 19:05:13 -- accel/accel.sh@42 -- # jq -r . 00:06:55.321 [2024-11-18 19:05:13.662932] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
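(The xor run above XORs two 4096-byte source buffers byte-wise into a destination -- the parity computation used in RAID-style layouts -- and the -x 3 run that follows raises the source count to three. A minimal sketch, assuming byte-wise semantics; illustrative only, not SPDK's implementation:

    #include <stddef.h>
    #include <stdint.h>

    /* dst[i] = srcs[0][i] ^ srcs[1][i] ^ ... ^ srcs[nsrcs-1][i] */
    static void xor_buffers(uint8_t *dst, const uint8_t *const srcs[],
                            int nsrcs, size_t len)
    {
        for (size_t i = 0; i < len; i++) {
            uint8_t b = srcs[0][i];
            for (int s = 1; s < nsrcs; s++)
                b ^= srcs[s][i];
            dst[i] = b;
        }
    }

The reported 2749 MiB/s matches 703936 x 4096 / 2^20 = 2749 MiB/s.)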
00:06:55.321 [2024-11-18 19:05:13.663017] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291590 ] 00:06:55.321 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.321 [2024-11-18 19:05:13.732401] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.321 [2024-11-18 19:05:13.798901] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=0x1 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=xor 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=2 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=software 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=32 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val=32 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- 
accel/accel.sh@21 -- # val=1 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.321 19:05:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:55.321 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.321 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.322 19:05:13 -- accel/accel.sh@21 -- # val=Yes 00:06:55.322 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.322 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.322 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.322 19:05:13 -- accel/accel.sh@21 -- # val= 00:06:55.322 19:05:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.322 19:05:13 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@21 -- # val= 00:06:56.701 19:05:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.701 19:05:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.701 19:05:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:56.701 19:05:14 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:56.701 19:05:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.701 00:06:56.701 real 0m2.658s 00:06:56.701 user 0m2.399s 00:06:56.701 sys 0m0.257s 00:06:56.701 19:05:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:56.701 19:05:14 -- common/autotest_common.sh@10 -- # set +x 00:06:56.701 ************************************ 00:06:56.702 END TEST accel_xor 00:06:56.702 ************************************ 00:06:56.702 19:05:15 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:56.702 19:05:15 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:56.702 19:05:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.702 19:05:15 -- common/autotest_common.sh@10 -- # set +x 00:06:56.702 ************************************ 00:06:56.702 START TEST accel_xor 
00:06:56.702 ************************************ 00:06:56.702 19:05:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:06:56.702 19:05:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.702 19:05:15 -- accel/accel.sh@17 -- # local accel_module 00:06:56.702 19:05:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:56.702 19:05:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:56.702 19:05:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.702 19:05:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.702 19:05:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.702 19:05:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.702 19:05:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.702 19:05:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.702 19:05:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.702 19:05:15 -- accel/accel.sh@42 -- # jq -r . 00:06:56.702 [2024-11-18 19:05:15.036476] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:56.702 [2024-11-18 19:05:15.036568] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291796 ] 00:06:56.702 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.702 [2024-11-18 19:05:15.106973] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.702 [2024-11-18 19:05:15.175398] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.082 19:05:16 -- accel/accel.sh@18 -- # out=' 00:06:58.082 SPDK Configuration: 00:06:58.082 Core mask: 0x1 00:06:58.082 00:06:58.082 Accel Perf Configuration: 00:06:58.082 Workload Type: xor 00:06:58.082 Source buffers: 3 00:06:58.082 Transfer size: 4096 bytes 00:06:58.082 Vector count 1 00:06:58.082 Module: software 00:06:58.082 Queue depth: 32 00:06:58.082 Allocate depth: 32 00:06:58.082 # threads/core: 1 00:06:58.082 Run time: 1 seconds 00:06:58.082 Verify: Yes 00:06:58.082 00:06:58.082 Running for 1 seconds... 00:06:58.082 00:06:58.082 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.082 ------------------------------------------------------------------------------------ 00:06:58.082 0,0 669408/s 2614 MiB/s 0 0 00:06:58.082 ==================================================================================== 00:06:58.082 Total 669408/s 2614 MiB/s 0 0' 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.082 19:05:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:58.082 19:05:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:58.082 19:05:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.082 19:05:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.082 19:05:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.082 19:05:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.082 19:05:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.082 19:05:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.082 19:05:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.082 19:05:16 -- accel/accel.sh@42 -- # jq -r . 00:06:58.082 [2024-11-18 19:05:16.367347] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
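(Worked check for the three-source run above: 669408 x 4096 / 2^20 = 2614 MiB/s, matching the table; as with the two-source run, the rate is consistent with counting one 4096-byte transfer per operation rather than one per source buffer.)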
00:06:58.082 [2024-11-18 19:05:16.367431] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1291957 ] 00:06:58.082 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.082 [2024-11-18 19:05:16.438194] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.082 [2024-11-18 19:05:16.505505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.082 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.082 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.082 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.082 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.082 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=0x1 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=xor 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=3 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=software 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=32 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=32 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- 
accel/accel.sh@21 -- # val=1 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val=Yes 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:58.083 19:05:16 -- accel/accel.sh@21 -- # val= 00:06:58.083 19:05:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # IFS=: 00:06:58.083 19:05:16 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@21 -- # val= 00:06:59.463 19:05:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.463 19:05:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.463 19:05:17 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:59.463 19:05:17 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:59.463 19:05:17 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.463 00:06:59.463 real 0m2.661s 00:06:59.463 user 0m2.398s 00:06:59.463 sys 0m0.261s 00:06:59.463 19:05:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:59.463 19:05:17 -- common/autotest_common.sh@10 -- # set +x 00:06:59.463 ************************************ 00:06:59.463 END TEST accel_xor 00:06:59.463 ************************************ 00:06:59.463 19:05:17 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:59.463 19:05:17 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:59.463 19:05:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.463 19:05:17 -- common/autotest_common.sh@10 -- # set +x 00:06:59.463 ************************************ 00:06:59.463 START TEST 
accel_dif_verify 00:06:59.463 ************************************ 00:06:59.463 19:05:17 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:59.463 19:05:17 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.463 19:05:17 -- accel/accel.sh@17 -- # local accel_module 00:06:59.463 19:05:17 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:59.463 19:05:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:59.463 19:05:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.463 19:05:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.463 19:05:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.463 19:05:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.463 19:05:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.463 19:05:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.463 19:05:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.463 19:05:17 -- accel/accel.sh@42 -- # jq -r . 00:06:59.463 [2024-11-18 19:05:17.742405] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:59.463 [2024-11-18 19:05:17.742491] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292221 ] 00:06:59.463 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.463 [2024-11-18 19:05:17.813829] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.463 [2024-11-18 19:05:17.883579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.841 19:05:19 -- accel/accel.sh@18 -- # out=' 00:07:00.841 SPDK Configuration: 00:07:00.841 Core mask: 0x1 00:07:00.841 00:07:00.841 Accel Perf Configuration: 00:07:00.841 Workload Type: dif_verify 00:07:00.841 Vector size: 4096 bytes 00:07:00.841 Transfer size: 4096 bytes 00:07:00.841 Block size: 512 bytes 00:07:00.841 Metadata size: 8 bytes 00:07:00.841 Vector count 1 00:07:00.841 Module: software 00:07:00.841 Queue depth: 32 00:07:00.841 Allocate depth: 32 00:07:00.841 # threads/core: 1 00:07:00.841 Run time: 1 seconds 00:07:00.841 Verify: No 00:07:00.841 00:07:00.841 Running for 1 seconds... 00:07:00.841 00:07:00.841 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:00.841 ------------------------------------------------------------------------------------ 00:07:00.841 0,0 244768/s 971 MiB/s 0 0 00:07:00.841 ==================================================================================== 00:07:00.841 Total 244768/s 956 MiB/s 0 0' 00:07:00.841 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.841 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.841 19:05:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:00.841 19:05:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:00.841 19:05:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.841 19:05:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.842 19:05:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.842 19:05:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.842 19:05:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.842 19:05:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.842 19:05:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.842 19:05:19 -- accel/accel.sh@42 -- # jq -r . 
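The trace above records the harness assembling an accel_perf command line for dif_verify: build_accel_config writes a JSON accel configuration to /dev/fd/62, -w selects the opcode, and -t caps the run at 1 second. The 956 MiB/s total in the summary table follows directly from the transfer rate and vector size: 244768 transfers/s * 4096 bytes / 2^20 = 956.1 MiB/s. A minimal standalone reproduction, assuming the queue depth and transfer size shown as defaults in the summary (-q and -o are not passed explicitly anywhere in this log), would be:

  # Sketch: 1-second software dif_verify run mirroring this log's settings.
  # Binary path is copied from the log; -q/-o mirror the reported defaults.
  SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK_ROOT/build/examples/accel_perf" -w dif_verify -t 1 -q 32 -o 4096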
00:07:00.842 [2024-11-18 19:05:19.072391] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:00.842 [2024-11-18 19:05:19.072479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292487 ] 00:07:00.842 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.842 [2024-11-18 19:05:19.140342] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.842 [2024-11-18 19:05:19.206751] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=0x1 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=dif_verify 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=software 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=32 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=32 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=1 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val=No 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:00.842 19:05:19 -- accel/accel.sh@21 -- # val= 00:07:00.842 19:05:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # IFS=: 00:07:00.842 19:05:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@21 -- # val= 00:07:01.779 19:05:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # IFS=: 00:07:01.779 19:05:20 -- accel/accel.sh@20 -- # read -r var val 00:07:01.779 19:05:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:01.779 19:05:20 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:01.779 19:05:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.779 00:07:01.779 real 0m2.658s 00:07:01.779 user 0m2.403s 00:07:01.779 sys 0m0.254s 00:07:01.779 19:05:20 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:07:01.779 19:05:20 -- common/autotest_common.sh@10 -- # set +x 00:07:01.779 ************************************ 00:07:01.779 END TEST accel_dif_verify 00:07:01.779 ************************************ 00:07:02.039 19:05:20 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:02.039 19:05:20 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:02.039 19:05:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.039 19:05:20 -- common/autotest_common.sh@10 -- # set +x 00:07:02.039 ************************************ 00:07:02.039 START TEST accel_dif_generate 00:07:02.039 ************************************ 00:07:02.039 19:05:20 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:02.039 19:05:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.039 19:05:20 -- accel/accel.sh@17 -- # local accel_module 00:07:02.039 19:05:20 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:02.039 19:05:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:02.040 19:05:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.040 19:05:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.040 19:05:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.040 19:05:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.040 19:05:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.040 19:05:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.040 19:05:20 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.040 19:05:20 -- accel/accel.sh@42 -- # jq -r . 00:07:02.040 [2024-11-18 19:05:20.444502] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:02.040 [2024-11-18 19:05:20.444594] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1292780 ] 00:07:02.040 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.040 [2024-11-18 19:05:20.516164] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.040 [2024-11-18 19:05:20.585174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.422 19:05:21 -- accel/accel.sh@18 -- # out=' 00:07:03.422 SPDK Configuration: 00:07:03.422 Core mask: 0x1 00:07:03.422 00:07:03.422 Accel Perf Configuration: 00:07:03.422 Workload Type: dif_generate 00:07:03.422 Vector size: 4096 bytes 00:07:03.422 Transfer size: 4096 bytes 00:07:03.422 Block size: 512 bytes 00:07:03.422 Metadata size: 8 bytes 00:07:03.422 Vector count 1 00:07:03.422 Module: software 00:07:03.422 Queue depth: 32 00:07:03.422 Allocate depth: 32 00:07:03.422 # threads/core: 1 00:07:03.422 Run time: 1 seconds 00:07:03.422 Verify: No 00:07:03.422 00:07:03.422 Running for 1 seconds... 
00:07:03.422 00:07:03.422 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.422 ------------------------------------------------------------------------------------ 00:07:03.422 0,0 285984/s 1134 MiB/s 0 0 00:07:03.422 ==================================================================================== 00:07:03.422 Total 285984/s 1117 MiB/s 0 0' 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:03.422 19:05:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:03.422 19:05:21 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.422 19:05:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.422 19:05:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.422 19:05:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.422 19:05:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.422 19:05:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.422 19:05:21 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.422 19:05:21 -- accel/accel.sh@42 -- # jq -r . 00:07:03.422 [2024-11-18 19:05:21.773416] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:03.422 [2024-11-18 19:05:21.773499] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293048 ] 00:07:03.422 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.422 [2024-11-18 19:05:21.842684] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.422 [2024-11-18 19:05:21.909056] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=0x1 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=dif_generate 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 
00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=software 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@23 -- # accel_module=software 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=32 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=32 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=1 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val=No 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:03.422 19:05:21 -- accel/accel.sh@21 -- # val= 00:07:03.422 19:05:21 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # IFS=: 00:07:03.422 19:05:21 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@21 -- # val= 00:07:04.804 19:05:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # IFS=: 00:07:04.804 19:05:23 -- accel/accel.sh@20 -- # read -r var val 00:07:04.804 19:05:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.804 19:05:23 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:04.804 19:05:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.804 00:07:04.804 real 0m2.657s 00:07:04.804 user 0m2.398s 00:07:04.804 sys 0m0.260s 00:07:04.804 19:05:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.804 19:05:23 -- common/autotest_common.sh@10 -- # set +x 00:07:04.804 ************************************ 00:07:04.804 END TEST accel_dif_generate 00:07:04.804 ************************************ 00:07:04.804 19:05:23 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:04.804 19:05:23 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:04.804 19:05:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.804 19:05:23 -- common/autotest_common.sh@10 -- # set +x 00:07:04.804 ************************************ 00:07:04.804 START TEST accel_dif_generate_copy 00:07:04.804 ************************************ 00:07:04.804 19:05:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:04.804 19:05:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.804 19:05:23 -- accel/accel.sh@17 -- # local accel_module 00:07:04.804 19:05:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:04.804 19:05:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:04.804 19:05:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.804 19:05:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.804 19:05:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.804 19:05:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.804 19:05:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.804 19:05:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.804 19:05:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.804 19:05:23 -- accel/accel.sh@42 -- # jq -r . 00:07:04.804 [2024-11-18 19:05:23.143540] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:04.804 [2024-11-18 19:05:23.143617] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293334 ] 00:07:04.804 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.804 [2024-11-18 19:05:23.212668] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.804 [2024-11-18 19:05:23.279633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.185 19:05:24 -- accel/accel.sh@18 -- # out=' 00:07:06.185 SPDK Configuration: 00:07:06.185 Core mask: 0x1 00:07:06.185 00:07:06.185 Accel Perf Configuration: 00:07:06.185 Workload Type: dif_generate_copy 00:07:06.185 Vector size: 4096 bytes 00:07:06.185 Transfer size: 4096 bytes 00:07:06.185 Vector count 1 00:07:06.185 Module: software 00:07:06.185 Queue depth: 32 00:07:06.185 Allocate depth: 32 00:07:06.185 # threads/core: 1 00:07:06.185 Run time: 1 seconds 00:07:06.185 Verify: No 00:07:06.185 00:07:06.185 Running for 1 seconds... 00:07:06.185 00:07:06.185 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:06.185 ------------------------------------------------------------------------------------ 00:07:06.185 0,0 222112/s 881 MiB/s 0 0 00:07:06.185 ==================================================================================== 00:07:06.185 Total 222112/s 867 MiB/s 0 0' 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:06.185 19:05:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:06.185 19:05:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.185 19:05:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.185 19:05:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.185 19:05:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.185 19:05:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.185 19:05:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.185 19:05:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.185 19:05:24 -- accel/accel.sh@42 -- # jq -r . 00:07:06.185 [2024-11-18 19:05:24.467314] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
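Every accel_perf invocation in this section boots a fresh SPDK application, so the startup sequence above repeats for each run: the DPDK EAL parameter dump, the "EAL: No free 2048 kB hugepages reported on node 1" notice, the available core count, and the reactor start on core 0. The hugepage notice is informational rather than fatal; it indicates that NUMA node 1 has an empty 2 MiB hugepage pool, typically because pages were reserved on node 0 only, and each run proceeds normally afterwards. As a hedged sketch, the per-node pools can be inspected like this:

  # Show reserved vs. free 2 MiB hugepages per NUMA node; a node1 total of 0
  # would account for the recurring EAL notice in this log.
  for d in /sys/devices/system/node/node*/hugepages/hugepages-2048kB; do
    echo "$d: free=$(cat "$d/free_hugepages") total=$(cat "$d/nr_hugepages")"
  done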
00:07:06.185 [2024-11-18 19:05:24.467400] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293585 ] 00:07:06.185 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.185 [2024-11-18 19:05:24.535150] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.185 [2024-11-18 19:05:24.600785] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val=0x1 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val=software 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@23 -- # accel_module=software 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val=32 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.185 19:05:24 -- accel/accel.sh@21 -- # val=32 00:07:06.185 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.185 19:05:24 -- accel/accel.sh@20 -- # read -r 
var val 00:07:06.186 19:05:24 -- accel/accel.sh@21 -- # val=1 00:07:06.186 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.186 19:05:24 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:06.186 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.186 19:05:24 -- accel/accel.sh@21 -- # val=No 00:07:06.186 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.186 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.186 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.186 19:05:24 -- accel/accel.sh@21 -- # val= 00:07:06.186 19:05:24 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.186 19:05:24 -- accel/accel.sh@20 -- # read -r var val 00:07:07.564 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.564 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.564 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.564 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.564 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.564 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.564 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.565 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.565 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.565 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.565 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.565 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.565 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.565 19:05:25 -- accel/accel.sh@21 -- # val= 00:07:07.565 19:05:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # IFS=: 00:07:07.565 19:05:25 -- accel/accel.sh@20 -- # read -r var val 00:07:07.565 19:05:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.565 19:05:25 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:07.565 19:05:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.565 00:07:07.565 real 0m2.649s 00:07:07.565 user 0m2.396s 00:07:07.565 sys 0m0.252s 00:07:07.565 19:05:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:07.565 19:05:25 -- common/autotest_common.sh@10 -- # set +x 00:07:07.565 ************************************ 00:07:07.565 END TEST accel_dif_generate_copy 00:07:07.565 ************************************ 00:07:07.565 19:05:25 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:07.565 19:05:25 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.565 19:05:25 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:07.565 19:05:25 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.565 19:05:25 -- common/autotest_common.sh@10 -- # set +x 00:07:07.565 ************************************ 00:07:07.565 START TEST accel_comp 00:07:07.565 ************************************ 00:07:07.565 19:05:25 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.565 19:05:25 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.565 19:05:25 -- accel/accel.sh@17 -- # local accel_module 00:07:07.565 19:05:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.565 19:05:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.565 19:05:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.565 19:05:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.565 19:05:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.565 19:05:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.565 19:05:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.565 19:05:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.565 19:05:25 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.565 19:05:25 -- accel/accel.sh@42 -- # jq -r . 00:07:07.565 [2024-11-18 19:05:25.829920] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:07.565 [2024-11-18 19:05:25.830005] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293788 ] 00:07:07.565 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.565 [2024-11-18 19:05:25.900131] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.565 [2024-11-18 19:05:25.967825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.642 19:05:27 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:08.642 00:07:08.642 SPDK Configuration: 00:07:08.642 Core mask: 0x1 00:07:08.642 00:07:08.642 Accel Perf Configuration: 00:07:08.642 Workload Type: compress 00:07:08.642 Transfer size: 4096 bytes 00:07:08.642 Vector count 1 00:07:08.642 Module: software 00:07:08.642 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:08.642 Queue depth: 32 00:07:08.642 Allocate depth: 32 00:07:08.642 # threads/core: 1 00:07:08.642 Run time: 1 seconds 00:07:08.642 Verify: No 00:07:08.642 00:07:08.642 Running for 1 seconds... 
00:07:08.642 00:07:08.642 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:08.642 ------------------------------------------------------------------------------------ 00:07:08.642 0,0 67904/s 283 MiB/s 0 0 00:07:08.642 ==================================================================================== 00:07:08.642 Total 67904/s 265 MiB/s 0 0' 00:07:08.642 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.642 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.642 19:05:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:08.642 19:05:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:08.642 19:05:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.642 19:05:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.642 19:05:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.642 19:05:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.642 19:05:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.642 19:05:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.642 19:05:27 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.642 19:05:27 -- accel/accel.sh@42 -- # jq -r . 00:07:08.642 [2024-11-18 19:05:27.160920] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:08.642 [2024-11-18 19:05:27.161024] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1293953 ] 00:07:08.642 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.642 [2024-11-18 19:05:27.231317] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.901 [2024-11-18 19:05:27.299502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=0x1 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=compress 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 
19:05:27 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=software 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=32 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=32 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=1 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val=No 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:08.901 19:05:27 -- accel/accel.sh@21 -- # val= 00:07:08.901 19:05:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # IFS=: 00:07:08.901 19:05:27 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # 
IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@21 -- # val= 00:07:10.280 19:05:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.280 19:05:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.280 19:05:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.280 19:05:28 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:10.280 19:05:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.280 00:07:10.280 real 0m2.664s 00:07:10.280 user 0m2.417s 00:07:10.280 sys 0m0.246s 00:07:10.280 19:05:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:10.280 19:05:28 -- common/autotest_common.sh@10 -- # set +x 00:07:10.280 ************************************ 00:07:10.280 END TEST accel_comp 00:07:10.280 ************************************ 00:07:10.280 19:05:28 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.280 19:05:28 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:10.280 19:05:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:10.280 19:05:28 -- common/autotest_common.sh@10 -- # set +x 00:07:10.280 ************************************ 00:07:10.280 START TEST accel_decomp 00:07:10.280 ************************************ 00:07:10.280 19:05:28 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.280 19:05:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.280 19:05:28 -- accel/accel.sh@17 -- # local accel_module 00:07:10.280 19:05:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.280 19:05:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:10.280 19:05:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.280 19:05:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.280 19:05:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.280 19:05:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.280 19:05:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.280 19:05:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.280 19:05:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.280 19:05:28 -- accel/accel.sh@42 -- # jq -r . 00:07:10.280 [2024-11-18 19:05:28.537195] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:10.280 [2024-11-18 19:05:28.537291] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1294202 ] 00:07:10.280 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.280 [2024-11-18 19:05:28.607211] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.280 [2024-11-18 19:05:28.674771] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.658 19:05:29 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:11.658 00:07:11.658 SPDK Configuration: 00:07:11.658 Core mask: 0x1 00:07:11.658 00:07:11.658 Accel Perf Configuration: 00:07:11.658 Workload Type: decompress 00:07:11.658 Transfer size: 4096 bytes 00:07:11.658 Vector count 1 00:07:11.658 Module: software 00:07:11.658 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.658 Queue depth: 32 00:07:11.658 Allocate depth: 32 00:07:11.658 # threads/core: 1 00:07:11.658 Run time: 1 seconds 00:07:11.658 Verify: Yes 00:07:11.658 00:07:11.658 Running for 1 seconds... 00:07:11.658 00:07:11.658 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.658 ------------------------------------------------------------------------------------ 00:07:11.658 0,0 92192/s 169 MiB/s 0 0 00:07:11.658 ==================================================================================== 00:07:11.658 Total 92192/s 360 MiB/s 0 0' 00:07:11.658 19:05:29 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:29 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.658 19:05:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.658 19:05:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.658 19:05:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.658 19:05:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.658 19:05:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.658 19:05:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.658 19:05:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.658 19:05:29 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.658 19:05:29 -- accel/accel.sh@42 -- # jq -r . 00:07:11.658 [2024-11-18 19:05:29.865927] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
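The asterisk banners and the real/user/sys summaries that bracket every case here (START TEST accel_decomp above, END TEST accel_decomp further down) are produced by the run_test helper from common/autotest_common.sh, which this log shows being invoked as run_test <name> accel_test <args>. A simplified sketch of that pattern, with the xtrace toggling and suite timing hooks of the real helper deliberately omitted:

  # Simplified run_test: banners around a timed command, exit code preserved.
  run_test() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"            # emits the real/user/sys lines seen in this log
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
  }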
00:07:11.658 [2024-11-18 19:05:29.866003] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1294472 ] 00:07:11.658 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.658 [2024-11-18 19:05:29.934157] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.658 [2024-11-18 19:05:30.001149] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=0x1 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=decompress 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=software 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@23 -- # accel_module=software 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=32 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 
19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=32 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=1 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val=Yes 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.658 19:05:30 -- accel/accel.sh@21 -- # val= 00:07:11.658 19:05:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.658 19:05:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@21 -- # val= 00:07:12.595 19:05:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # IFS=: 00:07:12.595 19:05:31 -- accel/accel.sh@20 -- # read -r var val 00:07:12.595 19:05:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.595 19:05:31 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:12.595 19:05:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.595 00:07:12.595 real 0m2.659s 00:07:12.595 user 0m2.401s 00:07:12.595 sys 0m0.257s 00:07:12.595 19:05:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:12.595 19:05:31 -- common/autotest_common.sh@10 -- # set +x 00:07:12.595 ************************************ 00:07:12.595 END TEST accel_decomp 00:07:12.595 ************************************ 00:07:12.854 19:05:31 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.854 19:05:31 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:12.854 19:05:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.854 19:05:31 -- common/autotest_common.sh@10 -- # set +x 00:07:12.854 ************************************ 00:07:12.854 START TEST accel_decmop_full 00:07:12.854 ************************************ 00:07:12.854 19:05:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.854 19:05:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.854 19:05:31 -- accel/accel.sh@17 -- # local accel_module 00:07:12.854 19:05:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.854 19:05:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:12.854 19:05:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.854 19:05:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.854 19:05:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.854 19:05:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.854 19:05:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.854 19:05:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.854 19:05:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.854 19:05:31 -- accel/accel.sh@42 -- # jq -r . 00:07:12.854 [2024-11-18 19:05:31.244799] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:12.854 [2024-11-18 19:05:31.244893] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1294754 ] 00:07:12.854 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.854 [2024-11-18 19:05:31.316109] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.854 [2024-11-18 19:05:31.381114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.233 19:05:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:14.233 00:07:14.233 SPDK Configuration: 00:07:14.233 Core mask: 0x1 00:07:14.233 00:07:14.233 Accel Perf Configuration: 00:07:14.233 Workload Type: decompress 00:07:14.233 Transfer size: 111250 bytes 00:07:14.233 Vector count 1 00:07:14.233 Module: software 00:07:14.233 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.233 Queue depth: 32 00:07:14.233 Allocate depth: 32 00:07:14.233 # threads/core: 1 00:07:14.233 Run time: 1 seconds 00:07:14.233 Verify: Yes 00:07:14.233 00:07:14.233 Running for 1 seconds... 
00:07:14.233 00:07:14.233 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:14.233 ------------------------------------------------------------------------------------ 00:07:14.233 0,0 5856/s 241 MiB/s 0 0 00:07:14.233 ==================================================================================== 00:07:14.233 Total 5856/s 621 MiB/s 0 0' 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.233 19:05:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.233 19:05:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.233 19:05:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.233 19:05:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.233 19:05:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.233 19:05:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.233 19:05:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.233 19:05:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.233 19:05:32 -- accel/accel.sh@42 -- # jq -r . 00:07:14.233 [2024-11-18 19:05:32.583545] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:14.233 [2024-11-18 19:05:32.583638] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1295022 ] 00:07:14.233 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.233 [2024-11-18 19:05:32.652280] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.233 [2024-11-18 19:05:32.717851] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=0x1 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=decompress 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=software 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=32 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=32 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=1 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val=Yes 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.233 19:05:32 -- accel/accel.sh@21 -- # val= 00:07:14.233 19:05:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.233 19:05:32 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 
-- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@21 -- # val= 00:07:15.612 19:05:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.612 19:05:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.612 19:05:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:15.612 19:05:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:15.612 19:05:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.612 00:07:15.612 real 0m2.678s 00:07:15.612 user 0m2.416s 00:07:15.612 sys 0m0.258s 00:07:15.612 19:05:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:15.612 19:05:33 -- common/autotest_common.sh@10 -- # set +x 00:07:15.612 ************************************ 00:07:15.612 END TEST accel_decmop_full 00:07:15.612 ************************************ 00:07:15.612 19:05:33 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.612 19:05:33 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:15.612 19:05:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.612 19:05:33 -- common/autotest_common.sh@10 -- # set +x 00:07:15.612 ************************************ 00:07:15.612 START TEST accel_decomp_mcore 00:07:15.612 ************************************ 00:07:15.612 19:05:33 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.612 19:05:33 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.612 19:05:33 -- accel/accel.sh@17 -- # local accel_module 00:07:15.612 19:05:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.612 19:05:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:15.612 19:05:33 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.612 19:05:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.612 19:05:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.612 19:05:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.612 19:05:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.612 19:05:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.612 19:05:33 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.612 19:05:33 -- accel/accel.sh@42 -- # jq -r . 00:07:15.612 [2024-11-18 19:05:33.965878] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
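The run starting here is the multi-core variant of the same software decompress test: the -m 0xf mask spreads the workload across four reactors (cores 0-3), which show up as four Core,Thread rows in the table that follows. A minimal sketch of reproducing it by hand, using only the paths and flags visible in the trace above; /dev/fd/62 is the harness's way of passing a JSON accel config, so a standalone run has to supply its own (feeding an empty object is an assumption that only holds when no hardware accel module is wanted):

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # -t 1: run for 1 second; -w decompress: workload; -y: verify output;
    # -l: compressed input file; -m 0xf: core mask selecting four reactors
    $SPDK/build/examples/accel_perf -c <(echo '{}') -t 1 -w decompress \
        -l $SPDK/test/accel/bib -y -m 0xf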
00:07:15.612 [2024-11-18 19:05:33.965965] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1295309 ] 00:07:15.612 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.612 [2024-11-18 19:05:34.036864] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:15.612 [2024-11-18 19:05:34.106480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.612 [2024-11-18 19:05:34.106583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:15.612 [2024-11-18 19:05:34.106669] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:15.612 [2024-11-18 19:05:34.106671] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.993 19:05:35 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.993 00:07:16.993 SPDK Configuration: 00:07:16.993 Core mask: 0xf 00:07:16.993 00:07:16.993 Accel Perf Configuration: 00:07:16.993 Workload Type: decompress 00:07:16.993 Transfer size: 4096 bytes 00:07:16.993 Vector count 1 00:07:16.993 Module: software 00:07:16.993 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.993 Queue depth: 32 00:07:16.993 Allocate depth: 32 00:07:16.993 # threads/core: 1 00:07:16.993 Run time: 1 seconds 00:07:16.993 Verify: Yes 00:07:16.993 00:07:16.993 Running for 1 seconds... 00:07:16.993 00:07:16.993 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.993 ------------------------------------------------------------------------------------ 00:07:16.993 0,0 75680/s 139 MiB/s 0 0 00:07:16.993 3,0 76032/s 140 MiB/s 0 0 00:07:16.993 2,0 75872/s 139 MiB/s 0 0 00:07:16.993 1,0 75936/s 139 MiB/s 0 0 00:07:16.993 ==================================================================================== 00:07:16.993 Total 303520/s 1185 MiB/s 0 0' 00:07:16.993 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.993 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.994 19:05:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.994 19:05:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.994 19:05:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.994 19:05:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.994 19:05:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.994 19:05:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.994 19:05:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.994 19:05:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.994 19:05:35 -- accel/accel.sh@42 -- # jq -r . 00:07:16.994 [2024-11-18 19:05:35.309061] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
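The Total row of the table above is internally consistent: four reactors each completing roughly 75,900 transfers of 4096 bytes per second sum to 303520/s, which corresponds to the printed 1185 MiB/s. A quick shell sanity check using only figures taken from the log:

    # 303520 transfers/s x 4096 bytes, converted to MiB/s
    echo $(( 303520 * 4096 / 1024 / 1024 ))   # prints 1185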
00:07:16.994 [2024-11-18 19:05:35.309144] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1295569 ] 00:07:16.994 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.994 [2024-11-18 19:05:35.380371] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:16.994 [2024-11-18 19:05:35.449499] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.994 [2024-11-18 19:05:35.449594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:16.994 [2024-11-18 19:05:35.449665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:16.994 [2024-11-18 19:05:35.449667] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=0xf 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=decompress 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=software 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=32 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=32 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=1 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val=Yes 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:16.994 19:05:35 -- accel/accel.sh@21 -- # val= 00:07:16.994 19:05:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # IFS=: 00:07:16.994 19:05:35 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 
19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@21 -- # val= 00:07:18.372 19:05:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.372 19:05:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.372 19:05:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:18.372 19:05:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:18.372 19:05:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.372 00:07:18.372 real 0m2.694s 00:07:18.372 user 0m9.077s 00:07:18.372 sys 0m0.286s 00:07:18.372 19:05:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:18.372 19:05:36 -- common/autotest_common.sh@10 -- # set +x 00:07:18.372 ************************************ 00:07:18.372 END TEST accel_decomp_mcore 00:07:18.372 ************************************ 00:07:18.372 19:05:36 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.372 19:05:36 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:18.372 19:05:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:18.372 19:05:36 -- common/autotest_common.sh@10 -- # set +x 00:07:18.372 ************************************ 00:07:18.372 START TEST accel_decomp_full_mcore 00:07:18.372 ************************************ 00:07:18.372 19:05:36 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.372 19:05:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:18.372 19:05:36 -- accel/accel.sh@17 -- # local accel_module 00:07:18.372 19:05:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.372 19:05:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.372 19:05:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.372 19:05:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.372 19:05:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.372 19:05:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.372 19:05:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.372 19:05:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.372 19:05:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.372 19:05:36 -- accel/accel.sh@42 -- # jq -r . 00:07:18.372 [2024-11-18 19:05:36.708287] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
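The timing summary of the mcore test above (real 0m2.694s, user 0m9.077s) is itself a cross-check on the 0xf core mask: a user/real ratio of roughly 3.4 means between three and four cores were busy for the whole run, which is what four reactors minus per-run setup overhead should look like. The ratio, computed with bc (assumed available, as on any autotest host):

    echo 'scale=1; 9.077 / 2.694' | bc   # prints 3.3 (bc truncates)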
00:07:18.372 [2024-11-18 19:05:36.708374] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1295782 ] 00:07:18.372 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.372 [2024-11-18 19:05:36.778035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.372 [2024-11-18 19:05:36.848540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.372 [2024-11-18 19:05:36.848626] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.372 [2024-11-18 19:05:36.848647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.372 [2024-11-18 19:05:36.848649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.752 19:05:38 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:19.752 00:07:19.752 SPDK Configuration: 00:07:19.752 Core mask: 0xf 00:07:19.752 00:07:19.752 Accel Perf Configuration: 00:07:19.752 Workload Type: decompress 00:07:19.752 Transfer size: 111250 bytes 00:07:19.752 Vector count 1 00:07:19.752 Module: software 00:07:19.752 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.752 Queue depth: 32 00:07:19.752 Allocate depth: 32 00:07:19.752 # threads/core: 1 00:07:19.752 Run time: 1 seconds 00:07:19.752 Verify: Yes 00:07:19.752 00:07:19.752 Running for 1 seconds... 00:07:19.752 00:07:19.752 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:19.752 ------------------------------------------------------------------------------------ 00:07:19.752 0,0 5792/s 239 MiB/s 0 0 00:07:19.752 3,0 5824/s 240 MiB/s 0 0 00:07:19.752 2,0 5824/s 240 MiB/s 0 0 00:07:19.752 1,0 5824/s 240 MiB/s 0 0 00:07:19.752 ==================================================================================== 00:07:19.752 Total 23264/s 2468 MiB/s 0 0' 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:19.752 19:05:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:19.752 19:05:38 -- accel/accel.sh@12 -- # build_accel_config 00:07:19.752 19:05:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.752 19:05:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.752 19:05:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.752 19:05:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.752 19:05:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.752 19:05:38 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.752 19:05:38 -- accel/accel.sh@42 -- # jq -r . 00:07:19.752 [2024-11-18 19:05:38.057304] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
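Compared with the 4 KiB mcore run, the -o 0 variant above moves full 111250-byte buffers, so the aggregate transfer rate drops about 13x while total bandwidth roughly doubles to 2468 MiB/s. The same style of check against the Total row:

    echo $(( 23264 * 111250 / 1024 / 1024 ))   # prints 2468 (MiB/s)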
00:07:19.752 [2024-11-18 19:05:38.057377] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1295962 ] 00:07:19.752 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.752 [2024-11-18 19:05:38.126081] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:19.752 [2024-11-18 19:05:38.196503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.752 [2024-11-18 19:05:38.196599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:19.752 [2024-11-18 19:05:38.196622] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:19.752 [2024-11-18 19:05:38.196629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=0xf 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=decompress 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=software 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=32 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=32 00:07:19.752 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.752 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.752 19:05:38 -- accel/accel.sh@21 -- # val=1 00:07:19.753 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.753 19:05:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.753 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.753 19:05:38 -- accel/accel.sh@21 -- # val=Yes 00:07:19.753 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.753 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.753 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:19.753 19:05:38 -- accel/accel.sh@21 -- # val= 00:07:19.753 19:05:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # IFS=: 00:07:19.753 19:05:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.129 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.129 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.129 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.129 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.129 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.129 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.129 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.129 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.129 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.130 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.130 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.130 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.130 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.130 
19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@21 -- # val= 00:07:21.130 19:05:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.130 19:05:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.130 19:05:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:21.130 19:05:39 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:21.130 19:05:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.130 00:07:21.130 real 0m2.707s 00:07:21.130 user 0m9.139s 00:07:21.130 sys 0m0.278s 00:07:21.130 19:05:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:21.130 19:05:39 -- common/autotest_common.sh@10 -- # set +x 00:07:21.130 ************************************ 00:07:21.130 END TEST accel_decomp_full_mcore 00:07:21.130 ************************************ 00:07:21.130 19:05:39 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.130 19:05:39 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:21.130 19:05:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.130 19:05:39 -- common/autotest_common.sh@10 -- # set +x 00:07:21.130 ************************************ 00:07:21.130 START TEST accel_decomp_mthread 00:07:21.130 ************************************ 00:07:21.130 19:05:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.130 19:05:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:21.130 19:05:39 -- accel/accel.sh@17 -- # local accel_module 00:07:21.130 19:05:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.130 19:05:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:21.130 19:05:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.130 19:05:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.130 19:05:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.130 19:05:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.130 19:05:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.130 19:05:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.130 19:05:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.130 19:05:39 -- accel/accel.sh@42 -- # jq -r . 00:07:21.130 [2024-11-18 19:05:39.464334] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:21.130 [2024-11-18 19:05:39.464421] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1296189 ] 00:07:21.130 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.130 [2024-11-18 19:05:39.536696] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.130 [2024-11-18 19:05:39.605256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.509 19:05:40 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:22.509 00:07:22.509 SPDK Configuration: 00:07:22.509 Core mask: 0x1 00:07:22.509 00:07:22.509 Accel Perf Configuration: 00:07:22.509 Workload Type: decompress 00:07:22.509 Transfer size: 4096 bytes 00:07:22.509 Vector count 1 00:07:22.509 Module: software 00:07:22.509 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.509 Queue depth: 32 00:07:22.509 Allocate depth: 32 00:07:22.509 # threads/core: 2 00:07:22.509 Run time: 1 seconds 00:07:22.509 Verify: Yes 00:07:22.509 00:07:22.509 Running for 1 seconds... 00:07:22.509 00:07:22.509 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:22.509 ------------------------------------------------------------------------------------ 00:07:22.509 0,1 44768/s 82 MiB/s 0 0 00:07:22.509 0,0 44672/s 82 MiB/s 0 0 00:07:22.509 ==================================================================================== 00:07:22.509 Total 89440/s 349 MiB/s 0 0' 00:07:22.509 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.509 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.509 19:05:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:22.509 19:05:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:22.509 19:05:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.509 19:05:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.509 19:05:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.509 19:05:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.509 19:05:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.509 19:05:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.509 19:05:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.509 19:05:40 -- accel/accel.sh@42 -- # jq -r . 00:07:22.509 [2024-11-18 19:05:40.800494] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
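Unlike the mcore runs, this test keeps the 0x1 core mask and instead passes -T 2, so the two table rows above (0,1 and 0,0) are two threads sharing core 0 rather than separate reactors. Their combined rate matches the Total line:

    echo $(( (44768 + 44672) * 4096 / 1024 / 1024 ))   # prints 349 (MiB/s)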
00:07:22.509 [2024-11-18 19:05:40.800601] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1296451 ] 00:07:22.510 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.510 [2024-11-18 19:05:40.870322] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.510 [2024-11-18 19:05:40.937597] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=0x1 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=decompress 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=software 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=32 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 
19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=32 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=2 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val=Yes 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.510 19:05:40 -- accel/accel.sh@21 -- # val= 00:07:22.510 19:05:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.510 19:05:40 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@21 -- # val= 00:07:23.889 19:05:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # IFS=: 00:07:23.889 19:05:42 -- accel/accel.sh@20 -- # read -r var val 00:07:23.889 19:05:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.889 19:05:42 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:23.889 19:05:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.889 00:07:23.889 real 0m2.671s 00:07:23.889 user 0m2.408s 00:07:23.889 sys 0m0.263s 00:07:23.889 19:05:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:23.889 19:05:42 -- common/autotest_common.sh@10 -- # 
set +x 00:07:23.889 ************************************ 00:07:23.890 END TEST accel_decomp_mthread 00:07:23.890 ************************************ 00:07:23.890 19:05:42 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.890 19:05:42 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:23.890 19:05:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.890 19:05:42 -- common/autotest_common.sh@10 -- # set +x 00:07:23.890 ************************************ 00:07:23.890 START TEST accel_deomp_full_mthread 00:07:23.890 ************************************ 00:07:23.890 19:05:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.890 19:05:42 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.890 19:05:42 -- accel/accel.sh@17 -- # local accel_module 00:07:23.890 19:05:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.890 19:05:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:23.890 19:05:42 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.890 19:05:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.890 19:05:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.890 19:05:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.890 19:05:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.890 19:05:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.890 19:05:42 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.890 19:05:42 -- accel/accel.sh@42 -- # jq -r . 00:07:23.890 [2024-11-18 19:05:42.180148] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:23.890 [2024-11-18 19:05:42.180231] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1296742 ] 00:07:23.890 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.890 [2024-11-18 19:05:42.250245] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.890 [2024-11-18 19:05:42.319733] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.269 19:05:43 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:25.269 00:07:25.269 SPDK Configuration: 00:07:25.269 Core mask: 0x1 00:07:25.269 00:07:25.269 Accel Perf Configuration: 00:07:25.269 Workload Type: decompress 00:07:25.269 Transfer size: 111250 bytes 00:07:25.269 Vector count 1 00:07:25.269 Module: software 00:07:25.269 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:25.269 Queue depth: 32 00:07:25.269 Allocate depth: 32 00:07:25.269 # threads/core: 2 00:07:25.269 Run time: 1 seconds 00:07:25.269 Verify: Yes 00:07:25.269 00:07:25.269 Running for 1 seconds... 
00:07:25.269 00:07:25.269 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:25.269 ------------------------------------------------------------------------------------ 00:07:25.269 0,1 2976/s 122 MiB/s 0 0 00:07:25.269 0,0 2944/s 121 MiB/s 0 0 00:07:25.269 ==================================================================================== 00:07:25.269 Total 5920/s 628 MiB/s 0 0' 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.269 19:05:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.269 19:05:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.269 19:05:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.269 19:05:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.269 19:05:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.269 19:05:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.269 19:05:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.269 19:05:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.269 19:05:43 -- accel/accel.sh@42 -- # jq -r . 00:07:25.269 [2024-11-18 19:05:43.536228] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:25.269 [2024-11-18 19:05:43.536310] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1297014 ] 00:07:25.269 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.269 [2024-11-18 19:05:43.607631] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.269 [2024-11-18 19:05:43.673555] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val=0x1 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val=decompress 
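The two threads split the full-buffer workload almost evenly (2976/s vs 2944/s), and their 5920/s total is within about 1% of the 5856/s that the single-threaded accel_decmop_full run achieved earlier, suggesting software decompress already saturates core 0 and a second thread on the same core buys nothing. Checking the Total row once more:

    echo $(( 5920 * 111250 / 1024 / 1024 ))   # prints 628 (MiB/s)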
00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.269 19:05:43 -- accel/accel.sh@21 -- # val=software 00:07:25.269 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.269 19:05:43 -- accel/accel.sh@23 -- # accel_module=software 00:07:25.269 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val=32 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val=32 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val=2 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val=Yes 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.270 19:05:43 -- accel/accel.sh@21 -- # val= 00:07:25.270 19:05:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.270 19:05:43 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@21 -- # val= 00:07:26.650 19:05:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.650 19:05:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.650 19:05:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:26.650 19:05:44 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:26.650 19:05:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.650 00:07:26.650 real 0m2.711s 00:07:26.650 user 0m2.451s 00:07:26.650 sys 0m0.256s 00:07:26.650 19:05:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:26.650 19:05:44 -- common/autotest_common.sh@10 -- # set +x 00:07:26.650 ************************************ 00:07:26.650 END TEST accel_deomp_full_mthread 00:07:26.650 ************************************ 00:07:26.650 19:05:44 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:26.650 19:05:44 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:26.650 19:05:44 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:26.650 19:05:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:26.650 19:05:44 -- common/autotest_common.sh@10 -- # set +x 00:07:26.650 19:05:44 -- accel/accel.sh@129 -- # build_accel_config 00:07:26.650 19:05:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.650 19:05:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.650 19:05:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.650 19:05:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.650 19:05:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.650 19:05:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.650 19:05:44 -- accel/accel.sh@42 -- # jq -r . 00:07:26.650 ************************************ 00:07:26.650 START TEST accel_dif_functional_tests 00:07:26.650 ************************************ 00:07:26.650 19:05:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:26.650 [2024-11-18 19:05:44.935590] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
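The DIF suite starting here is a CUnit binary rather than an accel_perf run, and the *ERROR* lines in its output below are expected: each "DIF not generated" case feeds deliberately mismatched Guard/App/Ref tags (note the 0x5a5a fill pattern in Expected/Actual) and the test passes precisely because dif.c rejects them. A minimal sketch of a direct invocation; the harness passes the JSON accel config on fd 62, and substituting an empty object is an assumption:

    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    $SPDK/test/accel/dif/dif -c <(echo '{}')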
00:07:26.650 [2024-11-18 19:05:44.935676] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1297297 ] 00:07:26.650 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.650 [2024-11-18 19:05:45.005944] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:26.650 [2024-11-18 19:05:45.074612] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.650 [2024-11-18 19:05:45.074707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.650 [2024-11-18 19:05:45.074707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:26.650 00:07:26.650 00:07:26.650 CUnit - A unit testing framework for C - Version 2.1-3 00:07:26.650 http://cunit.sourceforge.net/ 00:07:26.650 00:07:26.650 00:07:26.650 Suite: accel_dif 00:07:26.650 Test: verify: DIF generated, GUARD check ...passed 00:07:26.650 Test: verify: DIF generated, APPTAG check ...passed 00:07:26.650 Test: verify: DIF generated, REFTAG check ...passed 00:07:26.650 Test: verify: DIF not generated, GUARD check ...[2024-11-18 19:05:45.143936] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:26.650 [2024-11-18 19:05:45.143984] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:26.650 passed 00:07:26.650 Test: verify: DIF not generated, APPTAG check ...[2024-11-18 19:05:45.144036] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:26.650 [2024-11-18 19:05:45.144055] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:26.650 passed 00:07:26.650 Test: verify: DIF not generated, REFTAG check ...[2024-11-18 19:05:45.144078] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:26.650 [2024-11-18 19:05:45.144097] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:26.650 passed 00:07:26.650 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:26.650 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-18 19:05:45.144143] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:26.650 passed 00:07:26.650 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:26.650 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:26.650 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:26.650 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-18 19:05:45.144245] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:26.650 passed 00:07:26.650 Test: generate copy: DIF generated, GUARD check ...passed 00:07:26.650 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:26.650 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:26.650 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:26.650 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:26.651 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:26.651 Test: generate copy: iovecs-len validate ...[2024-11-18 19:05:45.144431] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
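Note: the "Failed to compare Guard/App Tag/Ref Tag" errors above are the intended negative paths of the DIF suite: a field is corrupted on purpose and the verify step must report the mismatch. The `-c /dev/fd/62` argument suggests the accel JSON config is fed over a process-substitution fd; a minimal sketch of that launch pattern (build_accel_config is named in the trace, its body here is a placeholder):

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    build_accel_config() { echo '{}'; }      # placeholder for the real JSON accel config
    "$rootdir/test/accel/dif/dif" -c <(build_accel_config)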
00:07:26.651 passed 00:07:26.651 Test: generate copy: buffer alignment validate ...passed 00:07:26.651 00:07:26.651 Run Summary: Type Total Ran Passed Failed Inactive 00:07:26.651 suites 1 1 n/a 0 0 00:07:26.651 tests 20 20 20 0 0 00:07:26.651 asserts 204 204 204 0 n/a 00:07:26.651 00:07:26.651 Elapsed time = 0.000 seconds 00:07:26.910 00:07:26.910 real 0m0.394s 00:07:26.910 user 0m0.587s 00:07:26.910 sys 0m0.163s 00:07:26.910 19:05:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:26.910 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:26.910 ************************************ 00:07:26.910 END TEST accel_dif_functional_tests 00:07:26.910 ************************************ 00:07:26.910 00:07:26.910 real 0m57.061s 00:07:26.910 user 1m4.487s 00:07:26.910 sys 0m7.102s 00:07:26.910 19:05:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:26.910 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:26.910 ************************************ 00:07:26.910 END TEST accel 00:07:26.910 ************************************ 00:07:26.910 19:05:45 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:26.910 19:05:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:26.910 19:05:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:26.910 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:26.910 ************************************ 00:07:26.910 START TEST accel_rpc 00:07:26.910 ************************************ 00:07:26.910 19:05:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:26.910 * Looking for test storage... 00:07:26.910 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:26.910 19:05:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:26.910 19:05:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:26.910 19:05:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:27.170 19:05:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:27.170 19:05:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:27.170 19:05:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:27.170 19:05:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:27.170 19:05:45 -- scripts/common.sh@335 -- # IFS=.-: 00:07:27.170 19:05:45 -- scripts/common.sh@335 -- # read -ra ver1 00:07:27.170 19:05:45 -- scripts/common.sh@336 -- # IFS=.-: 00:07:27.170 19:05:45 -- scripts/common.sh@336 -- # read -ra ver2 00:07:27.170 19:05:45 -- scripts/common.sh@337 -- # local 'op=<' 00:07:27.170 19:05:45 -- scripts/common.sh@339 -- # ver1_l=2 00:07:27.170 19:05:45 -- scripts/common.sh@340 -- # ver2_l=1 00:07:27.170 19:05:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:27.170 19:05:45 -- scripts/common.sh@343 -- # case "$op" in 00:07:27.170 19:05:45 -- scripts/common.sh@344 -- # : 1 00:07:27.170 19:05:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:27.170 19:05:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:27.170 19:05:45 -- scripts/common.sh@364 -- # decimal 1 00:07:27.170 19:05:45 -- scripts/common.sh@352 -- # local d=1 00:07:27.170 19:05:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:27.170 19:05:45 -- scripts/common.sh@354 -- # echo 1 00:07:27.170 19:05:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:27.170 19:05:45 -- scripts/common.sh@365 -- # decimal 2 00:07:27.170 19:05:45 -- scripts/common.sh@352 -- # local d=2 00:07:27.170 19:05:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:27.170 19:05:45 -- scripts/common.sh@354 -- # echo 2 00:07:27.170 19:05:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:27.170 19:05:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:27.170 19:05:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:27.170 19:05:45 -- scripts/common.sh@367 -- # return 0 00:07:27.170 19:05:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:27.170 19:05:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:27.170 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:27.170 --rc genhtml_branch_coverage=1 00:07:27.170 --rc genhtml_function_coverage=1 00:07:27.170 --rc genhtml_legend=1 00:07:27.170 --rc geninfo_all_blocks=1 00:07:27.170 --rc geninfo_unexecuted_blocks=1 00:07:27.170 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:27.170 ' 00:07:27.170 19:05:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:27.170 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:27.170 --rc genhtml_branch_coverage=1 00:07:27.170 --rc genhtml_function_coverage=1 00:07:27.170 --rc genhtml_legend=1 00:07:27.170 --rc geninfo_all_blocks=1 00:07:27.170 --rc geninfo_unexecuted_blocks=1 00:07:27.170 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:27.170 ' 00:07:27.170 19:05:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:27.170 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:27.170 --rc genhtml_branch_coverage=1 00:07:27.170 --rc genhtml_function_coverage=1 00:07:27.170 --rc genhtml_legend=1 00:07:27.170 --rc geninfo_all_blocks=1 00:07:27.170 --rc geninfo_unexecuted_blocks=1 00:07:27.170 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:27.170 ' 00:07:27.170 19:05:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:27.170 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:27.170 --rc genhtml_branch_coverage=1 00:07:27.170 --rc genhtml_function_coverage=1 00:07:27.170 --rc genhtml_legend=1 00:07:27.170 --rc geninfo_all_blocks=1 00:07:27.170 --rc geninfo_unexecuted_blocks=1 00:07:27.170 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:27.170 ' 00:07:27.170 19:05:45 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:27.170 19:05:45 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1297370 00:07:27.170 19:05:45 -- accel/accel_rpc.sh@15 -- # waitforlisten 1297370 00:07:27.170 19:05:45 -- common/autotest_common.sh@829 -- # '[' -z 1297370 ']' 00:07:27.170 19:05:45 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:27.170 19:05:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:27.170 19:05:45 -- common/autotest_common.sh@834 -- # local max_retries=100 
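Note: the accel_rpc suite starts the target with --wait-for-rpc so opcode assignment can happen before subsystem initialization. Condensed, the start-up handshake traced here looks like the following sketch (paths taken from the log; waitforlisten is the autotest_common.sh helper):

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$rootdir/build/bin/spdk_tgt" --wait-for-rpc &
    spdk_tgt_pid=$!
    waitforlisten $spdk_tgt_pid    # polls /var/tmp/spdk.sock until the RPC server answers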
00:07:27.170 19:05:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:27.170 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:27.170 19:05:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:27.170 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.170 [2024-11-18 19:05:45.580890] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:27.170 [2024-11-18 19:05:45.580947] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1297370 ] 00:07:27.170 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.171 [2024-11-18 19:05:45.647199] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.171 [2024-11-18 19:05:45.722893] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:27.171 [2024-11-18 19:05:45.723013] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.171 19:05:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:27.171 19:05:45 -- common/autotest_common.sh@862 -- # return 0 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:27.171 19:05:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:27.171 19:05:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:27.171 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.171 ************************************ 00:07:27.171 START TEST accel_assign_opcode 00:07:27.171 ************************************ 00:07:27.171 19:05:45 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:27.171 19:05:45 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.171 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.171 [2024-11-18 19:05:45.759418] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:27.171 19:05:45 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:27.171 19:05:45 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.171 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.171 [2024-11-18 19:05:45.767432] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:27.171 19:05:45 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.171 19:05:45 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:27.171 19:05:45 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.171 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.430 19:05:45 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.430 19:05:45 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:27.430 19:05:45 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:27.430 19:05:45 -- 
accel/accel_rpc.sh@42 -- # grep software 00:07:27.430 19:05:45 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:27.430 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.430 19:05:45 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:27.430 software 00:07:27.430 00:07:27.430 real 0m0.230s 00:07:27.430 user 0m0.042s 00:07:27.430 sys 0m0.011s 00:07:27.430 19:05:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:27.430 19:05:45 -- common/autotest_common.sh@10 -- # set +x 00:07:27.430 ************************************ 00:07:27.430 END TEST accel_assign_opcode 00:07:27.430 ************************************ 00:07:27.430 19:05:46 -- accel/accel_rpc.sh@55 -- # killprocess 1297370 00:07:27.430 19:05:46 -- common/autotest_common.sh@936 -- # '[' -z 1297370 ']' 00:07:27.430 19:05:46 -- common/autotest_common.sh@940 -- # kill -0 1297370 00:07:27.430 19:05:46 -- common/autotest_common.sh@941 -- # uname 00:07:27.430 19:05:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:27.430 19:05:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1297370 00:07:27.689 19:05:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:27.689 19:05:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:27.689 19:05:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1297370' 00:07:27.689 killing process with pid 1297370 00:07:27.689 19:05:46 -- common/autotest_common.sh@955 -- # kill 1297370 00:07:27.689 19:05:46 -- common/autotest_common.sh@960 -- # wait 1297370 00:07:27.948 00:07:27.948 real 0m0.992s 00:07:27.948 user 0m0.889s 00:07:27.948 sys 0m0.441s 00:07:27.948 19:05:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:27.948 19:05:46 -- common/autotest_common.sh@10 -- # set +x 00:07:27.948 ************************************ 00:07:27.948 END TEST accel_rpc 00:07:27.948 ************************************ 00:07:27.948 19:05:46 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:27.948 19:05:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:27.948 19:05:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:27.948 19:05:46 -- common/autotest_common.sh@10 -- # set +x 00:07:27.948 ************************************ 00:07:27.948 START TEST app_cmdline 00:07:27.948 ************************************ 00:07:27.948 19:05:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:27.948 * Looking for test storage... 
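Note: the lone "software" line above is the suite's pass condition. Condensed, the opcode-assignment flow it traces is three RPCs (rpc.py path assumed from the log; rootdir as in the earlier sketch):

    rpc="$rootdir/scripts/rpc.py"
    "$rpc" accel_assign_opc -o copy -m software     # only honored before framework_start_init
    "$rpc" framework_start_init
    "$rpc" accel_get_opc_assignments | jq -r .copy  # prints: software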
00:07:27.948 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:27.948 19:05:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:27.948 19:05:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:27.948 19:05:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:28.208 19:05:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:28.208 19:05:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:28.208 19:05:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:28.208 19:05:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:28.208 19:05:46 -- scripts/common.sh@335 -- # IFS=.-: 00:07:28.208 19:05:46 -- scripts/common.sh@335 -- # read -ra ver1 00:07:28.208 19:05:46 -- scripts/common.sh@336 -- # IFS=.-: 00:07:28.208 19:05:46 -- scripts/common.sh@336 -- # read -ra ver2 00:07:28.208 19:05:46 -- scripts/common.sh@337 -- # local 'op=<' 00:07:28.208 19:05:46 -- scripts/common.sh@339 -- # ver1_l=2 00:07:28.208 19:05:46 -- scripts/common.sh@340 -- # ver2_l=1 00:07:28.208 19:05:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:28.208 19:05:46 -- scripts/common.sh@343 -- # case "$op" in 00:07:28.208 19:05:46 -- scripts/common.sh@344 -- # : 1 00:07:28.208 19:05:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:28.208 19:05:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:28.208 19:05:46 -- scripts/common.sh@364 -- # decimal 1 00:07:28.208 19:05:46 -- scripts/common.sh@352 -- # local d=1 00:07:28.208 19:05:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:28.208 19:05:46 -- scripts/common.sh@354 -- # echo 1 00:07:28.208 19:05:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:28.208 19:05:46 -- scripts/common.sh@365 -- # decimal 2 00:07:28.208 19:05:46 -- scripts/common.sh@352 -- # local d=2 00:07:28.208 19:05:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:28.208 19:05:46 -- scripts/common.sh@354 -- # echo 2 00:07:28.208 19:05:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:28.208 19:05:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:28.208 19:05:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:28.208 19:05:46 -- scripts/common.sh@367 -- # return 0 00:07:28.208 19:05:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:28.208 19:05:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.208 --rc genhtml_branch_coverage=1 00:07:28.208 --rc genhtml_function_coverage=1 00:07:28.208 --rc genhtml_legend=1 00:07:28.208 --rc geninfo_all_blocks=1 00:07:28.208 --rc geninfo_unexecuted_blocks=1 00:07:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.208 ' 00:07:28.208 19:05:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.208 --rc genhtml_branch_coverage=1 00:07:28.208 --rc genhtml_function_coverage=1 00:07:28.208 --rc genhtml_legend=1 00:07:28.208 --rc geninfo_all_blocks=1 00:07:28.208 --rc geninfo_unexecuted_blocks=1 00:07:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.208 ' 00:07:28.208 19:05:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.208 --rc genhtml_branch_coverage=1 00:07:28.208 
--rc genhtml_function_coverage=1 00:07:28.209 --rc genhtml_legend=1 00:07:28.209 --rc geninfo_all_blocks=1 00:07:28.209 --rc geninfo_unexecuted_blocks=1 00:07:28.209 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.209 ' 00:07:28.209 19:05:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:28.209 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:28.209 --rc genhtml_branch_coverage=1 00:07:28.209 --rc genhtml_function_coverage=1 00:07:28.209 --rc genhtml_legend=1 00:07:28.209 --rc geninfo_all_blocks=1 00:07:28.209 --rc geninfo_unexecuted_blocks=1 00:07:28.209 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:28.209 ' 00:07:28.209 19:05:46 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:28.209 19:05:46 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1297711 00:07:28.209 19:05:46 -- app/cmdline.sh@18 -- # waitforlisten 1297711 00:07:28.209 19:05:46 -- common/autotest_common.sh@829 -- # '[' -z 1297711 ']' 00:07:28.209 19:05:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:28.209 19:05:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:28.209 19:05:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:28.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:28.209 19:05:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:28.209 19:05:46 -- common/autotest_common.sh@10 -- # set +x 00:07:28.209 19:05:46 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:28.209 [2024-11-18 19:05:46.637254] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
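Note: this target instance is started with an RPC allowlist, so only the two listed methods may be called and anything else must be rejected. A sketch of the behavior under test (paths from the log):

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$rootdir/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
    waitforlisten $!
    "$rootdir/scripts/rpc.py" spdk_get_version          # allowed
    "$rootdir/scripts/rpc.py" rpc_get_methods           # allowed
    "$rootdir/scripts/rpc.py" env_dpdk_get_mem_stats    # rejected: Method not found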
00:07:28.209 [2024-11-18 19:05:46.637340] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1297711 ] 00:07:28.209 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.209 [2024-11-18 19:05:46.706013] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.209 [2024-11-18 19:05:46.780998] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:28.209 [2024-11-18 19:05:46.781114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.146 19:05:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:29.146 19:05:47 -- common/autotest_common.sh@862 -- # return 0 00:07:29.146 19:05:47 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:29.146 { 00:07:29.146 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:29.146 "fields": { 00:07:29.146 "major": 24, 00:07:29.146 "minor": 1, 00:07:29.146 "patch": 1, 00:07:29.146 "suffix": "-pre", 00:07:29.146 "commit": "c13c99a5e" 00:07:29.146 } 00:07:29.146 } 00:07:29.146 19:05:47 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:29.146 19:05:47 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:29.146 19:05:47 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:29.146 19:05:47 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:29.146 19:05:47 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:29.146 19:05:47 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.146 19:05:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.146 19:05:47 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:29.146 19:05:47 -- app/cmdline.sh@26 -- # sort 00:07:29.146 19:05:47 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.146 19:05:47 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:29.146 19:05:47 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:29.146 19:05:47 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:29.146 19:05:47 -- common/autotest_common.sh@650 -- # local es=0 00:07:29.147 19:05:47 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:29.147 19:05:47 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:29.147 19:05:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:29.147 19:05:47 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:29.147 19:05:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:29.147 19:05:47 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:29.147 19:05:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:29.147 19:05:47 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:29.147 19:05:47 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:29.147 19:05:47 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:29.406 request: 00:07:29.406 { 00:07:29.406 "method": "env_dpdk_get_mem_stats", 00:07:29.406 "req_id": 1 00:07:29.406 } 00:07:29.406 Got JSON-RPC error response 00:07:29.406 response: 00:07:29.406 { 00:07:29.406 "code": -32601, 00:07:29.406 "message": "Method not found" 00:07:29.406 } 00:07:29.406 19:05:47 -- common/autotest_common.sh@653 -- # es=1 00:07:29.406 19:05:47 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:29.406 19:05:47 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:29.406 19:05:47 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:29.406 19:05:47 -- app/cmdline.sh@1 -- # killprocess 1297711 00:07:29.406 19:05:47 -- common/autotest_common.sh@936 -- # '[' -z 1297711 ']' 00:07:29.406 19:05:47 -- common/autotest_common.sh@940 -- # kill -0 1297711 00:07:29.406 19:05:47 -- common/autotest_common.sh@941 -- # uname 00:07:29.406 19:05:47 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:29.406 19:05:47 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1297711 00:07:29.406 19:05:47 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:29.406 19:05:47 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:29.406 19:05:47 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1297711' 00:07:29.406 killing process with pid 1297711 00:07:29.406 19:05:47 -- common/autotest_common.sh@955 -- # kill 1297711 00:07:29.406 19:05:47 -- common/autotest_common.sh@960 -- # wait 1297711 00:07:29.666 00:07:29.666 real 0m1.760s 00:07:29.666 user 0m2.016s 00:07:29.666 sys 0m0.514s 00:07:29.666 19:05:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.666 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:29.666 ************************************ 00:07:29.666 END TEST app_cmdline 00:07:29.666 ************************************ 00:07:29.666 19:05:48 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:29.666 19:05:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.666 19:05:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.666 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:29.666 ************************************ 00:07:29.666 START TEST version 00:07:29.666 ************************************ 00:07:29.666 19:05:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:29.926 * Looking for test storage... 
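Note: the code -32601 in the response above is the standard JSON-RPC 2.0 "method not found" error; the suite asserts it via the NOT helper from autotest_common.sh, which inverts the exit status. The same rejection can be reproduced by hand against the UNIX socket, roughly as below (a sketch; some nc variants need a short linger such as -q 1 to read the reply, and the exact response formatting may differ):

    printf '%s\n' '{"jsonrpc":"2.0","method":"env_dpdk_get_mem_stats","id":1}' \
        | nc -U /var/tmp/spdk.sock
    # expected shape: {"jsonrpc":"2.0","id":1,"error":{"code":-32601,"message":"Method not found"}}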
00:07:29.926 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:29.926 19:05:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:29.926 19:05:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:29.926 19:05:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:29.926 19:05:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:29.926 19:05:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:29.926 19:05:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:29.926 19:05:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:29.926 19:05:48 -- scripts/common.sh@335 -- # IFS=.-: 00:07:29.926 19:05:48 -- scripts/common.sh@335 -- # read -ra ver1 00:07:29.926 19:05:48 -- scripts/common.sh@336 -- # IFS=.-: 00:07:29.926 19:05:48 -- scripts/common.sh@336 -- # read -ra ver2 00:07:29.926 19:05:48 -- scripts/common.sh@337 -- # local 'op=<' 00:07:29.926 19:05:48 -- scripts/common.sh@339 -- # ver1_l=2 00:07:29.926 19:05:48 -- scripts/common.sh@340 -- # ver2_l=1 00:07:29.926 19:05:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:29.926 19:05:48 -- scripts/common.sh@343 -- # case "$op" in 00:07:29.926 19:05:48 -- scripts/common.sh@344 -- # : 1 00:07:29.926 19:05:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:29.926 19:05:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:29.926 19:05:48 -- scripts/common.sh@364 -- # decimal 1 00:07:29.926 19:05:48 -- scripts/common.sh@352 -- # local d=1 00:07:29.926 19:05:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:29.926 19:05:48 -- scripts/common.sh@354 -- # echo 1 00:07:29.926 19:05:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:29.926 19:05:48 -- scripts/common.sh@365 -- # decimal 2 00:07:29.926 19:05:48 -- scripts/common.sh@352 -- # local d=2 00:07:29.926 19:05:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:29.926 19:05:48 -- scripts/common.sh@354 -- # echo 2 00:07:29.926 19:05:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:29.926 19:05:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:29.926 19:05:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:29.926 19:05:48 -- scripts/common.sh@367 -- # return 0 00:07:29.926 19:05:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:29.926 19:05:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:29.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.926 --rc genhtml_branch_coverage=1 00:07:29.926 --rc genhtml_function_coverage=1 00:07:29.926 --rc genhtml_legend=1 00:07:29.926 --rc geninfo_all_blocks=1 00:07:29.926 --rc geninfo_unexecuted_blocks=1 00:07:29.926 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.926 ' 00:07:29.926 19:05:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:29.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.926 --rc genhtml_branch_coverage=1 00:07:29.926 --rc genhtml_function_coverage=1 00:07:29.926 --rc genhtml_legend=1 00:07:29.926 --rc geninfo_all_blocks=1 00:07:29.926 --rc geninfo_unexecuted_blocks=1 00:07:29.926 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.926 ' 00:07:29.926 19:05:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:29.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.926 --rc genhtml_branch_coverage=1 00:07:29.926 
--rc genhtml_function_coverage=1 00:07:29.926 --rc genhtml_legend=1 00:07:29.926 --rc geninfo_all_blocks=1 00:07:29.926 --rc geninfo_unexecuted_blocks=1 00:07:29.926 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.926 ' 00:07:29.926 19:05:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:29.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.926 --rc genhtml_branch_coverage=1 00:07:29.926 --rc genhtml_function_coverage=1 00:07:29.926 --rc genhtml_legend=1 00:07:29.926 --rc geninfo_all_blocks=1 00:07:29.926 --rc geninfo_unexecuted_blocks=1 00:07:29.926 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.926 ' 00:07:29.926 19:05:48 -- app/version.sh@17 -- # get_header_version major 00:07:29.926 19:05:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:29.926 19:05:48 -- app/version.sh@14 -- # cut -f2 00:07:29.926 19:05:48 -- app/version.sh@14 -- # tr -d '"' 00:07:29.926 19:05:48 -- app/version.sh@17 -- # major=24 00:07:29.926 19:05:48 -- app/version.sh@18 -- # get_header_version minor 00:07:29.926 19:05:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:29.926 19:05:48 -- app/version.sh@14 -- # cut -f2 00:07:29.926 19:05:48 -- app/version.sh@14 -- # tr -d '"' 00:07:29.926 19:05:48 -- app/version.sh@18 -- # minor=1 00:07:29.926 19:05:48 -- app/version.sh@19 -- # get_header_version patch 00:07:29.926 19:05:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:29.926 19:05:48 -- app/version.sh@14 -- # cut -f2 00:07:29.926 19:05:48 -- app/version.sh@14 -- # tr -d '"' 00:07:29.926 19:05:48 -- app/version.sh@19 -- # patch=1 00:07:29.926 19:05:48 -- app/version.sh@20 -- # get_header_version suffix 00:07:29.926 19:05:48 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:29.926 19:05:48 -- app/version.sh@14 -- # cut -f2 00:07:29.927 19:05:48 -- app/version.sh@14 -- # tr -d '"' 00:07:29.927 19:05:48 -- app/version.sh@20 -- # suffix=-pre 00:07:29.927 19:05:48 -- app/version.sh@22 -- # version=24.1 00:07:29.927 19:05:48 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:29.927 19:05:48 -- app/version.sh@25 -- # version=24.1.1 00:07:29.927 19:05:48 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:29.927 19:05:48 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:29.927 19:05:48 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:29.927 19:05:48 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:29.927 19:05:48 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:29.927 00:07:29.927 real 0m0.259s 00:07:29.927 user 0m0.162s 00:07:29.927 sys 0m0.148s 00:07:29.927 19:05:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.927 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:29.927 
************************************ 00:07:29.927 END TEST version 00:07:29.927 ************************************ 00:07:30.186 19:05:48 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@191 -- # uname -s 00:07:30.186 19:05:48 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:30.186 19:05:48 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:30.186 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.186 19:05:48 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:30.186 19:05:48 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:30.186 19:05:48 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:30.186 19:05:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:30.186 19:05:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.186 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.186 ************************************ 00:07:30.186 START TEST llvm_fuzz 00:07:30.186 ************************************ 00:07:30.186 19:05:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:30.186 * Looking for test storage... 
00:07:30.186 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:30.186 19:05:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:30.186 19:05:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:30.186 19:05:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:30.186 19:05:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:30.186 19:05:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:30.186 19:05:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:30.186 19:05:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:30.186 19:05:48 -- scripts/common.sh@335 -- # IFS=.-: 00:07:30.186 19:05:48 -- scripts/common.sh@335 -- # read -ra ver1 00:07:30.186 19:05:48 -- scripts/common.sh@336 -- # IFS=.-: 00:07:30.186 19:05:48 -- scripts/common.sh@336 -- # read -ra ver2 00:07:30.186 19:05:48 -- scripts/common.sh@337 -- # local 'op=<' 00:07:30.186 19:05:48 -- scripts/common.sh@339 -- # ver1_l=2 00:07:30.186 19:05:48 -- scripts/common.sh@340 -- # ver2_l=1 00:07:30.186 19:05:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:30.186 19:05:48 -- scripts/common.sh@343 -- # case "$op" in 00:07:30.186 19:05:48 -- scripts/common.sh@344 -- # : 1 00:07:30.186 19:05:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:30.186 19:05:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:30.186 19:05:48 -- scripts/common.sh@364 -- # decimal 1 00:07:30.186 19:05:48 -- scripts/common.sh@352 -- # local d=1 00:07:30.186 19:05:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.186 19:05:48 -- scripts/common.sh@354 -- # echo 1 00:07:30.186 19:05:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:30.186 19:05:48 -- scripts/common.sh@365 -- # decimal 2 00:07:30.186 19:05:48 -- scripts/common.sh@352 -- # local d=2 00:07:30.447 19:05:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.447 19:05:48 -- scripts/common.sh@354 -- # echo 2 00:07:30.447 19:05:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:30.447 19:05:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:30.447 19:05:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:30.448 19:05:48 -- scripts/common.sh@367 -- # return 0 00:07:30.448 19:05:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 
--rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:30.448 19:05:48 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:30.448 19:05:48 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:30.448 19:05:48 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:30.448 19:05:48 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:30.448 19:05:48 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:30.448 19:05:48 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:30.448 19:05:48 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:30.448 19:05:48 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:30.448 19:05:48 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:30.448 19:05:48 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:30.448 19:05:48 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:30.448 19:05:48 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:30.448 19:05:48 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:30.448 19:05:48 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:30.448 19:05:48 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:30.448 19:05:48 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:30.448 19:05:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.448 19:05:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.448 ************************************ 00:07:30.448 START TEST nvmf_fuzz 00:07:30.448 ************************************ 00:07:30.448 19:05:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:30.448 * Looking for test storage... 
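Note: fuzz-target discovery above is plain bash globbing: list the entries under test/fuzz/llvm, then strip the directory prefixes, yielding the "common.sh llvm-gcov.sh nvmf vfio" set echoed in the trace. The idiom, isolated:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    fuzzers=("$rootdir/test/fuzz/llvm/"*)   # full paths of everything in the dir
    fuzzers=("${fuzzers[@]##*/}")           # keep basenames only
    echo "${fuzzers[@]}"                    # -> common.sh llvm-gcov.sh nvmf vfio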
00:07:30.448 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.448 19:05:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:30.448 19:05:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:30.448 19:05:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:30.448 19:05:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:30.448 19:05:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:30.448 19:05:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:30.448 19:05:48 -- scripts/common.sh@335 -- # IFS=.-: 00:07:30.448 19:05:48 -- scripts/common.sh@335 -- # read -ra ver1 00:07:30.448 19:05:48 -- scripts/common.sh@336 -- # IFS=.-: 00:07:30.448 19:05:48 -- scripts/common.sh@336 -- # read -ra ver2 00:07:30.448 19:05:48 -- scripts/common.sh@337 -- # local 'op=<' 00:07:30.448 19:05:48 -- scripts/common.sh@339 -- # ver1_l=2 00:07:30.448 19:05:48 -- scripts/common.sh@340 -- # ver2_l=1 00:07:30.448 19:05:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:30.448 19:05:48 -- scripts/common.sh@343 -- # case "$op" in 00:07:30.448 19:05:48 -- scripts/common.sh@344 -- # : 1 00:07:30.448 19:05:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:30.448 19:05:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:30.448 19:05:48 -- scripts/common.sh@364 -- # decimal 1 00:07:30.448 19:05:48 -- scripts/common.sh@352 -- # local d=1 00:07:30.448 19:05:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.448 19:05:48 -- scripts/common.sh@354 -- # echo 1 00:07:30.448 19:05:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:30.448 19:05:48 -- scripts/common.sh@365 -- # decimal 2 00:07:30.448 19:05:48 -- scripts/common.sh@352 -- # local d=2 00:07:30.448 19:05:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.448 19:05:48 -- scripts/common.sh@354 -- # echo 2 00:07:30.448 19:05:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:30.448 19:05:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:30.448 19:05:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:30.448 19:05:48 -- scripts/common.sh@367 -- # return 0 00:07:30.448 19:05:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 
00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:30.448 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.448 --rc genhtml_branch_coverage=1 00:07:30.448 --rc genhtml_function_coverage=1 00:07:30.448 --rc genhtml_legend=1 00:07:30.448 --rc geninfo_all_blocks=1 00:07:30.448 --rc geninfo_unexecuted_blocks=1 00:07:30.448 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.448 ' 00:07:30.448 19:05:48 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:30.448 19:05:48 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:30.448 19:05:49 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:30.448 19:05:49 -- common/autotest_common.sh@34 -- # set -e 00:07:30.448 19:05:49 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:30.448 19:05:49 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:30.448 19:05:49 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:30.448 19:05:49 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:30.448 19:05:49 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:30.448 19:05:49 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:30.448 19:05:49 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:30.448 19:05:49 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:30.448 19:05:49 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:30.448 19:05:49 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:30.448 19:05:49 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:30.448 19:05:49 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:30.448 19:05:49 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:30.448 19:05:49 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:30.448 19:05:49 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:30.448 19:05:49 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:30.448 19:05:49 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:30.448 19:05:49 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:30.448 19:05:49 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:30.448 19:05:49 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:30.448 19:05:49 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:30.448 19:05:49 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:30.448 19:05:49 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:30.449 19:05:49 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:30.449 19:05:49 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:30.449 19:05:49 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:30.449 19:05:49 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:30.449 19:05:49 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:30.449 19:05:49 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:30.449 
19:05:49 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:30.449 19:05:49 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:30.449 19:05:49 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:30.449 19:05:49 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:30.449 19:05:49 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:30.449 19:05:49 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:30.449 19:05:49 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:30.449 19:05:49 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:30.449 19:05:49 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:30.449 19:05:49 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:30.449 19:05:49 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:30.449 19:05:49 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:30.449 19:05:49 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:30.449 19:05:49 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:30.449 19:05:49 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:30.449 19:05:49 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:30.449 19:05:49 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:30.449 19:05:49 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:30.449 19:05:49 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:30.449 19:05:49 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:30.449 19:05:49 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:30.449 19:05:49 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:30.449 19:05:49 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:30.449 19:05:49 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:30.449 19:05:49 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:30.449 19:05:49 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:30.449 19:05:49 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:30.449 19:05:49 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:30.449 19:05:49 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:30.449 19:05:49 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:30.449 19:05:49 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:30.449 19:05:49 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:30.449 19:05:49 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:30.449 19:05:49 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:30.449 19:05:49 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:30.449 19:05:49 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:30.449 19:05:49 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:30.449 19:05:49 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:30.449 19:05:49 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:30.449 19:05:49 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:30.449 19:05:49 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:30.449 19:05:49 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:30.449 19:05:49 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:30.449 19:05:49 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:30.449 19:05:49 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:30.449 19:05:49 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:07:30.449 19:05:49 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:30.449 19:05:49 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:30.449 19:05:49 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:30.449 19:05:49 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:30.449 19:05:49 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:30.449 19:05:49 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:30.449 19:05:49 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:30.449 19:05:49 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:30.449 19:05:49 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:30.449 19:05:49 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:30.449 19:05:49 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:30.449 19:05:49 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:30.449 19:05:49 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:30.449 19:05:49 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:30.449 19:05:49 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:30.449 19:05:49 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:30.449 19:05:49 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:30.449 19:05:49 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:30.449 19:05:49 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:30.449 19:05:49 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:30.449 19:05:49 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:30.449 19:05:49 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:30.449 19:05:49 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:30.449 19:05:49 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:30.449 #define SPDK_CONFIG_H 00:07:30.449 #define SPDK_CONFIG_APPS 1 00:07:30.449 #define SPDK_CONFIG_ARCH native 00:07:30.449 #undef SPDK_CONFIG_ASAN 00:07:30.449 #undef SPDK_CONFIG_AVAHI 00:07:30.449 #undef SPDK_CONFIG_CET 00:07:30.449 #define SPDK_CONFIG_COVERAGE 1 00:07:30.449 #define SPDK_CONFIG_CROSS_PREFIX 00:07:30.449 #undef SPDK_CONFIG_CRYPTO 00:07:30.449 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:30.449 #undef SPDK_CONFIG_CUSTOMOCF 00:07:30.449 #undef SPDK_CONFIG_DAOS 00:07:30.449 #define SPDK_CONFIG_DAOS_DIR 00:07:30.449 #define SPDK_CONFIG_DEBUG 1 00:07:30.449 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:30.449 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:30.449 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:30.449 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:30.449 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:30.449 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:30.449 #define SPDK_CONFIG_EXAMPLES 1 00:07:30.449 #undef SPDK_CONFIG_FC 00:07:30.449 #define SPDK_CONFIG_FC_PATH 00:07:30.449 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:07:30.449 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:30.449 #undef SPDK_CONFIG_FUSE 00:07:30.449 #define SPDK_CONFIG_FUZZER 1 00:07:30.449 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:30.449 #undef SPDK_CONFIG_GOLANG 00:07:30.449 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:30.449 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:30.449 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:30.449 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:30.449 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:30.449 #define SPDK_CONFIG_IDXD 1 00:07:30.449 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:30.449 #undef SPDK_CONFIG_IPSEC_MB 00:07:30.449 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:30.449 #define SPDK_CONFIG_ISAL 1 00:07:30.449 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:30.449 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:30.449 #define SPDK_CONFIG_LIBDIR 00:07:30.449 #undef SPDK_CONFIG_LTO 00:07:30.449 #define SPDK_CONFIG_MAX_LCORES 00:07:30.449 #define SPDK_CONFIG_NVME_CUSE 1 00:07:30.449 #undef SPDK_CONFIG_OCF 00:07:30.449 #define SPDK_CONFIG_OCF_PATH 00:07:30.449 #define SPDK_CONFIG_OPENSSL_PATH 00:07:30.449 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:30.449 #undef SPDK_CONFIG_PGO_USE 00:07:30.449 #define SPDK_CONFIG_PREFIX /usr/local 00:07:30.449 #undef SPDK_CONFIG_RAID5F 00:07:30.449 #undef SPDK_CONFIG_RBD 00:07:30.449 #define SPDK_CONFIG_RDMA 1 00:07:30.449 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:30.449 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:30.449 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:30.449 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:30.449 #undef SPDK_CONFIG_SHARED 00:07:30.449 #undef SPDK_CONFIG_SMA 00:07:30.449 #define SPDK_CONFIG_TESTS 1 00:07:30.449 #undef SPDK_CONFIG_TSAN 00:07:30.449 #define SPDK_CONFIG_UBLK 1 00:07:30.449 #define SPDK_CONFIG_UBSAN 1 00:07:30.449 #undef SPDK_CONFIG_UNIT_TESTS 00:07:30.449 #undef SPDK_CONFIG_URING 00:07:30.449 #define SPDK_CONFIG_URING_PATH 00:07:30.449 #undef SPDK_CONFIG_URING_ZNS 00:07:30.449 #undef SPDK_CONFIG_USDT 00:07:30.449 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:30.449 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:30.449 #define SPDK_CONFIG_VFIO_USER 1 00:07:30.449 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:30.449 #define SPDK_CONFIG_VHOST 1 00:07:30.449 #define SPDK_CONFIG_VIRTIO 1 00:07:30.449 #undef SPDK_CONFIG_VTUNE 00:07:30.449 #define SPDK_CONFIG_VTUNE_DIR 00:07:30.449 #define SPDK_CONFIG_WERROR 1 00:07:30.449 #define SPDK_CONFIG_WPDK_DIR 00:07:30.449 #undef SPDK_CONFIG_XNVME 00:07:30.449 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:30.449 19:05:49 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:30.449 19:05:49 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:30.449 19:05:49 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:30.449 19:05:49 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:30.449 19:05:49 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:30.450 19:05:49 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:30.450 19:05:49 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:30.450 19:05:49 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:30.450 19:05:49 -- paths/export.sh@5 -- # export PATH 00:07:30.450 19:05:49 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:30.450 19:05:49 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:30.450 19:05:49 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:30.450 19:05:49 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:30.711 19:05:49 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:30.711 19:05:49 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:30.711 19:05:49 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:30.711 19:05:49 -- pm/common@16 -- # TEST_TAG=N/A 00:07:30.711 19:05:49 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:30.711 19:05:49 -- common/autotest_common.sh@52 -- # : 1 00:07:30.711 19:05:49 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:30.711 19:05:49 -- common/autotest_common.sh@56 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:30.711 19:05:49 -- common/autotest_common.sh@58 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:30.711 19:05:49 -- common/autotest_common.sh@60 -- # : 1 00:07:30.711 19:05:49 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:30.711 19:05:49 -- common/autotest_common.sh@62 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:30.711 19:05:49 -- common/autotest_common.sh@64 -- # : 00:07:30.711 19:05:49 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:30.711 19:05:49 -- common/autotest_common.sh@66 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:30.711 19:05:49 -- common/autotest_common.sh@68 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:30.711 19:05:49 -- common/autotest_common.sh@70 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:30.711 19:05:49 -- common/autotest_common.sh@72 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:30.711 19:05:49 -- common/autotest_common.sh@74 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:30.711 19:05:49 -- common/autotest_common.sh@76 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:30.711 19:05:49 -- common/autotest_common.sh@78 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:30.711 19:05:49 -- common/autotest_common.sh@80 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:30.711 19:05:49 -- common/autotest_common.sh@82 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:30.711 19:05:49 -- common/autotest_common.sh@84 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:30.711 19:05:49 -- common/autotest_common.sh@86 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:30.711 19:05:49 -- common/autotest_common.sh@88 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:30.711 19:05:49 -- common/autotest_common.sh@90 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:30.711 19:05:49 -- common/autotest_common.sh@92 -- # : 1 00:07:30.711 19:05:49 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:30.711 19:05:49 -- common/autotest_common.sh@94 -- # : 1 00:07:30.711 19:05:49 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:30.711 19:05:49 -- common/autotest_common.sh@96 -- # : rdma 00:07:30.711 19:05:49 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:30.711 19:05:49 -- common/autotest_common.sh@98 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:30.711 19:05:49 -- common/autotest_common.sh@100 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:30.711 19:05:49 -- common/autotest_common.sh@102 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:30.711 19:05:49 -- common/autotest_common.sh@104 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:30.711 19:05:49 -- common/autotest_common.sh@106 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:30.711 19:05:49 -- common/autotest_common.sh@108 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:30.711 19:05:49 -- common/autotest_common.sh@110 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:30.711 19:05:49 -- common/autotest_common.sh@112 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:30.711 19:05:49 -- common/autotest_common.sh@114 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:30.711 19:05:49 -- common/autotest_common.sh@116 -- # : 1 00:07:30.711 19:05:49 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:30.711 19:05:49 -- common/autotest_common.sh@118 -- # : 00:07:30.711 19:05:49 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:30.711 19:05:49 -- common/autotest_common.sh@120 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:30.711 19:05:49 -- common/autotest_common.sh@122 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:30.711 19:05:49 -- common/autotest_common.sh@124 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:30.711 19:05:49 -- common/autotest_common.sh@126 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:30.711 19:05:49 -- common/autotest_common.sh@128 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:30.711 19:05:49 -- common/autotest_common.sh@130 -- # : 0 00:07:30.711 19:05:49 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:30.712 19:05:49 -- common/autotest_common.sh@132 -- # : 00:07:30.712 19:05:49 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:30.712 19:05:49 -- common/autotest_common.sh@134 -- # : true 00:07:30.712 19:05:49 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:30.712 19:05:49 -- common/autotest_common.sh@136 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:30.712 19:05:49 -- common/autotest_common.sh@138 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:30.712 19:05:49 -- common/autotest_common.sh@140 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:30.712 19:05:49 -- common/autotest_common.sh@142 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:30.712 19:05:49 -- common/autotest_common.sh@144 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:30.712 19:05:49 -- common/autotest_common.sh@146 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:30.712 19:05:49 -- common/autotest_common.sh@148 -- # : 00:07:30.712 19:05:49 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:30.712 19:05:49 -- common/autotest_common.sh@150 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:30.712 19:05:49 -- common/autotest_common.sh@152 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:30.712 19:05:49 -- common/autotest_common.sh@154 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:30.712 19:05:49 -- common/autotest_common.sh@156 -- # : 0 00:07:30.712 19:05:49 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:30.712 19:05:49 -- common/autotest_common.sh@158 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:30.712 19:05:49 -- common/autotest_common.sh@160 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:30.712 19:05:49 -- common/autotest_common.sh@163 -- # : 00:07:30.712 19:05:49 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:30.712 19:05:49 -- common/autotest_common.sh@165 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:30.712 19:05:49 -- common/autotest_common.sh@167 -- # : 0 00:07:30.712 19:05:49 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:30.712 19:05:49 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:30.712 19:05:49 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:30.712 19:05:49 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:30.712 19:05:49 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:30.712 19:05:49 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:30.712 19:05:49 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:30.712 19:05:49 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:30.712 19:05:49 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:30.712 19:05:49 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:30.712 19:05:49 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:30.712 19:05:49 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:30.712 19:05:49 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:30.712 19:05:49 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:30.712 19:05:49 -- common/autotest_common.sh@196 -- # cat 00:07:30.712 19:05:49 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:30.712 19:05:49 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:30.712 19:05:49 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:30.712 19:05:49 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:30.712 19:05:49 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:30.712 19:05:49 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:30.712 19:05:49 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:30.712 19:05:49 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:30.712 19:05:49 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:30.712 19:05:49 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:30.712 19:05:49 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:30.712 19:05:49 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:30.712 19:05:49 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:30.712 19:05:49 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:30.712 19:05:49 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:30.712 19:05:49 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:30.712 19:05:49 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:30.712 19:05:49 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:30.712 19:05:49 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:30.712 19:05:49 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:30.712 19:05:49 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:30.712 19:05:49 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:30.712 19:05:49 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:30.712 19:05:49 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:30.712 19:05:49 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:30.713 19:05:49 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:30.713 19:05:49 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:30.713 19:05:49 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:30.713 19:05:49 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:30.713 19:05:49 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:30.713 19:05:49 -- common/autotest_common.sh@259 -- # valgrind= 00:07:30.713 19:05:49 -- common/autotest_common.sh@265 -- # uname -s 00:07:30.713 19:05:49 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:30.713 19:05:49 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:30.713 19:05:49 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:30.713 19:05:49 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:30.713 19:05:49 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:30.713 19:05:49 
-- common/autotest_common.sh@275 -- # MAKE=make 00:07:30.713 19:05:49 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:30.713 19:05:49 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:30.713 19:05:49 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:30.713 19:05:49 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:30.713 19:05:49 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:30.713 19:05:49 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:30.713 19:05:49 -- common/autotest_common.sh@319 -- # [[ -z 1298225 ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@319 -- # kill -0 1298225 00:07:30.713 19:05:49 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:30.713 19:05:49 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:30.713 19:05:49 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:30.713 19:05:49 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:30.713 19:05:49 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:30.713 19:05:49 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:30.713 19:05:49 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:30.713 19:05:49 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.KWQIpD 00:07:30.713 19:05:49 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:30.713 19:05:49 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.KWQIpD/tests/nvmf /tmp/spdk.KWQIpD 00:07:30.713 19:05:49 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@328 -- # df -T 00:07:30.713 19:05:49 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=54453428224 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730582528 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=7277154304 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864031744 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=30865092608 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=200704 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:30.713 19:05:49 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:30.713 19:05:49 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:30.713 19:05:49 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:30.713 19:05:49 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:30.713 * Looking for test storage... 
00:07:30.713 19:05:49 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:30.713 19:05:49 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:30.713 19:05:49 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.713 19:05:49 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:30.713 19:05:49 -- common/autotest_common.sh@373 -- # mount=/ 00:07:30.713 19:05:49 -- common/autotest_common.sh@375 -- # target_space=54453428224 00:07:30.713 19:05:49 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:30.713 19:05:49 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:30.713 19:05:49 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@382 -- # new_size=9491746816 00:07:30.713 19:05:49 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:30.713 19:05:49 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.713 19:05:49 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.713 19:05:49 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.713 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:30.713 19:05:49 -- common/autotest_common.sh@390 -- # return 0 00:07:30.713 19:05:49 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:30.713 19:05:49 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:30.713 19:05:49 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:30.713 19:05:49 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:30.713 19:05:49 -- common/autotest_common.sh@1682 -- # true 00:07:30.713 19:05:49 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:30.713 19:05:49 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@27 -- # exec 00:07:30.713 19:05:49 -- common/autotest_common.sh@29 -- # exec 00:07:30.713 19:05:49 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:30.713 19:05:49 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:30.713 19:05:49 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:30.713 19:05:49 -- common/autotest_common.sh@18 -- # set -x 00:07:30.713 19:05:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:30.713 19:05:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:30.713 19:05:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:30.713 19:05:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:30.713 19:05:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:30.713 19:05:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:30.714 19:05:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:30.714 19:05:49 -- scripts/common.sh@335 -- # IFS=.-: 00:07:30.714 19:05:49 -- scripts/common.sh@335 -- # read -ra ver1 00:07:30.714 19:05:49 -- scripts/common.sh@336 -- # IFS=.-: 00:07:30.714 19:05:49 -- scripts/common.sh@336 -- # read -ra ver2 00:07:30.714 19:05:49 -- scripts/common.sh@337 -- # local 'op=<' 00:07:30.714 19:05:49 -- scripts/common.sh@339 -- # ver1_l=2 00:07:30.714 19:05:49 -- scripts/common.sh@340 -- # ver2_l=1 00:07:30.714 19:05:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:30.714 19:05:49 -- scripts/common.sh@343 -- # case "$op" in 00:07:30.714 19:05:49 -- scripts/common.sh@344 -- # : 1 00:07:30.714 19:05:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:30.714 19:05:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:30.714 19:05:49 -- scripts/common.sh@364 -- # decimal 1 00:07:30.714 19:05:49 -- scripts/common.sh@352 -- # local d=1 00:07:30.714 19:05:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.714 19:05:49 -- scripts/common.sh@354 -- # echo 1 00:07:30.714 19:05:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:30.714 19:05:49 -- scripts/common.sh@365 -- # decimal 2 00:07:30.714 19:05:49 -- scripts/common.sh@352 -- # local d=2 00:07:30.714 19:05:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.714 19:05:49 -- scripts/common.sh@354 -- # echo 2 00:07:30.714 19:05:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:30.714 19:05:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:30.714 19:05:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:30.714 19:05:49 -- scripts/common.sh@367 -- # return 0 00:07:30.714 19:05:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.714 19:05:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:30.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.714 --rc genhtml_branch_coverage=1 00:07:30.714 --rc genhtml_function_coverage=1 00:07:30.714 --rc genhtml_legend=1 00:07:30.714 --rc geninfo_all_blocks=1 00:07:30.714 --rc geninfo_unexecuted_blocks=1 00:07:30.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.714 ' 00:07:30.714 19:05:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:30.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.714 --rc genhtml_branch_coverage=1 00:07:30.714 --rc genhtml_function_coverage=1 00:07:30.714 --rc genhtml_legend=1 00:07:30.714 --rc geninfo_all_blocks=1 00:07:30.714 --rc geninfo_unexecuted_blocks=1 00:07:30.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.714 ' 00:07:30.714 19:05:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:30.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:30.714 --rc genhtml_branch_coverage=1 00:07:30.714 --rc genhtml_function_coverage=1 00:07:30.714 --rc genhtml_legend=1 00:07:30.714 --rc geninfo_all_blocks=1 00:07:30.714 --rc geninfo_unexecuted_blocks=1 00:07:30.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.714 ' 00:07:30.714 19:05:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:30.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.714 --rc genhtml_branch_coverage=1 00:07:30.714 --rc genhtml_function_coverage=1 00:07:30.714 --rc genhtml_legend=1 00:07:30.714 --rc geninfo_all_blocks=1 00:07:30.714 --rc geninfo_unexecuted_blocks=1 00:07:30.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.714 ' 00:07:30.714 19:05:49 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:30.714 19:05:49 -- ../common.sh@8 -- # pids=() 00:07:30.714 19:05:49 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:30.714 19:05:49 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:30.714 19:05:49 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:30.714 19:05:49 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:30.714 19:05:49 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:30.714 19:05:49 -- nvmf/run.sh@61 -- # mem_size=512 00:07:30.714 19:05:49 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:30.714 19:05:49 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:30.714 19:05:49 -- ../common.sh@69 -- # local fuzz_num=25 00:07:30.714 19:05:49 -- ../common.sh@70 -- # local time=1 00:07:30.714 19:05:49 -- ../common.sh@72 -- # (( i = 0 )) 00:07:30.714 19:05:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:30.714 19:05:49 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:30.714 19:05:49 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:30.714 19:05:49 -- nvmf/run.sh@24 -- # local timen=1 00:07:30.714 19:05:49 -- nvmf/run.sh@25 -- # local core=0x1 00:07:30.714 19:05:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:30.714 19:05:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:30.714 19:05:49 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:30.714 19:05:49 -- nvmf/run.sh@29 -- # port=4400 00:07:30.714 19:05:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:30.714 19:05:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:30.714 19:05:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:30.714 19:05:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:30.714 [2024-11-18 19:05:49.302317] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:07:30.714 [2024-11-18 19:05:49.302407] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1298467 ] 00:07:30.973 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.973 [2024-11-18 19:05:49.566082] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.231 [2024-11-18 19:05:49.657500] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:31.231 [2024-11-18 19:05:49.657649] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.231 [2024-11-18 19:05:49.715808] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:31.231 [2024-11-18 19:05:49.732148] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:31.231 INFO: Running with entropic power schedule (0xFF, 100). 00:07:31.231 INFO: Seed: 2723842855 00:07:31.232 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:31.232 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:31.232 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:31.232 INFO: A corpus is not provided, starting from an empty corpus 00:07:31.232 #2 INITED exec/s: 0 rss: 60Mb 00:07:31.232 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:31.232 This may also happen if the target rejected all inputs we tried so far 00:07:31.232 [2024-11-18 19:05:49.787250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.232 [2024-11-18 19:05:49.787278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.490 NEW_FUNC[1/671]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:31.490 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:31.491 #20 NEW cov: 11552 ft: 11553 corp: 2/109b lim: 320 exec/s: 0 rss: 68Mb L: 108/108 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:31.750 [2024-11-18 19:05:50.108404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.108480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #26 NEW cov: 11671 ft: 12250 corp: 3/217b lim: 320 exec/s: 0 rss: 68Mb L: 108/108 MS: 1 ChangeBinInt- 00:07:31.750 [2024-11-18 19:05:50.158177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.158206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #27 NEW cov: 11677 ft: 12567 corp: 4/302b lim: 320 exec/s: 0 rss: 68Mb L: 85/108 MS: 1 EraseBytes- 00:07:31.750 
[2024-11-18 19:05:50.198307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.198332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #28 NEW cov: 11762 ft: 12933 corp: 5/387b lim: 320 exec/s: 0 rss: 68Mb L: 85/108 MS: 1 ShuffleBytes- 00:07:31.750 [2024-11-18 19:05:50.238403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.238428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #29 NEW cov: 11762 ft: 13013 corp: 6/472b lim: 320 exec/s: 0 rss: 68Mb L: 85/108 MS: 1 EraseBytes- 00:07:31.750 [2024-11-18 19:05:50.268497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.268522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #30 NEW cov: 11762 ft: 13120 corp: 7/557b lim: 320 exec/s: 0 rss: 68Mb L: 85/108 MS: 1 ChangeByte- 00:07:31.750 [2024-11-18 19:05:50.308615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.308641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.750 #31 NEW cov: 11762 ft: 13231 corp: 8/669b lim: 320 exec/s: 0 rss: 68Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:07:31.750 [2024-11-18 19:05:50.348704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:31.750 [2024-11-18 19:05:50.348731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #32 NEW cov: 11762 ft: 13352 corp: 9/754b lim: 320 exec/s: 0 rss: 68Mb L: 85/112 MS: 1 ChangeBinInt- 00:07:32.010 [2024-11-18 19:05:50.388831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.388856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #33 NEW cov: 11762 ft: 13376 corp: 10/874b lim: 320 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:07:32.010 [2024-11-18 19:05:50.428965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.428990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #39 NEW cov: 11762 ft: 13446 corp: 11/960b lim: 320 exec/s: 0 rss: 68Mb L: 86/120 MS: 1 
InsertByte- 00:07:32.010 [2024-11-18 19:05:50.469082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.469107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #40 NEW cov: 11762 ft: 13525 corp: 12/1080b lim: 320 exec/s: 0 rss: 68Mb L: 120/120 MS: 1 CopyPart- 00:07:32.010 [2024-11-18 19:05:50.509176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.509201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #41 NEW cov: 11762 ft: 13550 corp: 13/1166b lim: 320 exec/s: 0 rss: 68Mb L: 86/120 MS: 1 ShuffleBytes- 00:07:32.010 [2024-11-18 19:05:50.549266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.549291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #42 NEW cov: 11762 ft: 13558 corp: 14/1274b lim: 320 exec/s: 0 rss: 69Mb L: 108/120 MS: 1 ChangeBinInt- 00:07:32.010 [2024-11-18 19:05:50.589350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.010 [2024-11-18 19:05:50.589376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.010 #46 NEW cov: 11762 ft: 13618 corp: 15/1350b lim: 320 exec/s: 0 rss: 69Mb L: 76/120 MS: 4 EraseBytes-ShuffleBytes-ShuffleBytes-CopyPart- 00:07:32.270 [2024-11-18 19:05:50.619497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.619522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 #47 NEW cov: 11762 ft: 13662 corp: 16/1470b lim: 320 exec/s: 0 rss: 69Mb L: 120/120 MS: 1 ShuffleBytes- 00:07:32.270 [2024-11-18 19:05:50.659596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.659620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:32.270 #48 NEW cov: 11785 ft: 13718 corp: 17/1591b lim: 320 exec/s: 0 rss: 69Mb L: 121/121 MS: 1 InsertByte- 00:07:32.270 [2024-11-18 19:05:50.699697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.699721] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 #49 NEW cov: 11785 ft: 13739 corp: 18/1699b lim: 320 exec/s: 0 rss: 69Mb L: 108/121 MS: 1 ChangeBit- 00:07:32.270 [2024-11-18 19:05:50.729912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.729936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 [2024-11-18 19:05:50.730013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.270 [2024-11-18 19:05:50.730027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.270 [2024-11-18 19:05:50.730082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffff2f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.730095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:32.270 #50 NEW cov: 11785 ft: 13980 corp: 19/1901b lim: 320 exec/s: 0 rss: 69Mb L: 202/202 MS: 1 CrossOver- 00:07:32.270 [2024-11-18 19:05:50.769863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.769887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 #51 NEW cov: 11785 ft: 14006 corp: 20/2021b lim: 320 exec/s: 51 rss: 69Mb L: 120/202 MS: 1 ChangeBinInt- 00:07:32.270 [2024-11-18 19:05:50.810109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.810134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.270 [2024-11-18 19:05:50.810194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.810208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.270 #52 NEW cov: 11785 ft: 14228 corp: 21/2151b lim: 320 exec/s: 52 rss: 69Mb L: 130/202 MS: 1 CrossOver- 00:07:32.270 [2024-11-18 19:05:50.850078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.270 [2024-11-18 19:05:50.850103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.529 #53 NEW cov: 11785 ft: 14273 corp: 22/2259b lim: 320 exec/s: 53 rss: 69Mb L: 108/202 MS: 1 ChangeBinInt- 00:07:32.529 [2024-11-18 19:05:50.890255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND 
(ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.529 [2024-11-18 19:05:50.890279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.529 #54 NEW cov: 11785 ft: 14337 corp: 23/2345b lim: 320 exec/s: 54 rss: 69Mb L: 86/202 MS: 1 InsertByte-
00:07:32.530 [2024-11-18 19:05:50.920297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.530 [2024-11-18 19:05:50.920322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 #55 NEW cov: 11785 ft: 14352 corp: 24/2466b lim: 320 exec/s: 55 rss: 69Mb L: 121/202 MS: 1 ChangeBinInt-
00:07:32.530 [2024-11-18 19:05:50.960468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.530 [2024-11-18 19:05:50.960492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 #56 NEW cov: 11785 ft: 14414 corp: 25/2587b lim: 320 exec/s: 56 rss: 69Mb L: 121/202 MS: 1 InsertByte-
00:07:32.530 [2024-11-18 19:05:50.990512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.530 [2024-11-18 19:05:50.990536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 #57 NEW cov: 11785 ft: 14444 corp: 26/2707b lim: 320 exec/s: 57 rss: 69Mb L: 120/202 MS: 1 ChangeBit-
00:07:32.530 [2024-11-18 19:05:51.030749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.530 [2024-11-18 19:05:51.030777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 [2024-11-18 19:05:51.030833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (88) qid:0 cid:5 nsid:88888888 cdw10:88888888 cdw11:88888888
00:07:32.530 [2024-11-18 19:05:51.030847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.530 #58 NEW cov: 11805 ft: 14504 corp: 27/2867b lim: 320 exec/s: 58 rss: 69Mb L: 160/202 MS: 1 InsertRepeatedBytes-
00:07:32.530 [2024-11-18 19:05:51.070750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.530 [2024-11-18 19:05:51.070774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 #64 NEW cov: 11805 ft: 14529 corp: 28/2988b lim: 320 exec/s: 64 rss: 69Mb L: 121/202 MS: 1 ChangeBit-
00:07:32.530 [2024-11-18 19:05:51.110872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xff08000000000000
00:07:32.530 [2024-11-18 19:05:51.110897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.530 #65 NEW cov: 11805 ft: 14565 corp: 29/3108b lim: 320 exec/s: 65 rss: 69Mb L: 120/202 MS: 1 ChangeBinInt-
00:07:32.790 [2024-11-18 19:05:51.151091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.151116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 [2024-11-18 19:05:51.151173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.151187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.790 #66 NEW cov: 11805 ft: 14571 corp: 30/3257b lim: 320 exec/s: 66 rss: 69Mb L: 149/202 MS: 1 CopyPart-
00:07:32.790 [2024-11-18 19:05:51.191207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.191231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 [2024-11-18 19:05:51.191304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (88) qid:0 cid:5 nsid:88888888 cdw10:88888888 cdw11:88888888
00:07:32.790 [2024-11-18 19:05:51.191318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.790 #67 NEW cov: 11805 ft: 14585 corp: 31/3417b lim: 320 exec/s: 67 rss: 69Mb L: 160/202 MS: 1 ChangeBit-
00:07:32.790 [2024-11-18 19:05:51.231268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.231293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 #68 NEW cov: 11805 ft: 14619 corp: 32/3538b lim: 320 exec/s: 68 rss: 69Mb L: 121/202 MS: 1 ChangeByte-
00:07:32.790 [2024-11-18 19:05:51.271331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.271357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 #69 NEW cov: 11805 ft: 14634 corp: 33/3624b lim: 320 exec/s: 69 rss: 69Mb L: 86/202 MS: 1 ShuffleBytes-
00:07:32.790 [2024-11-18 19:05:51.311579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:32.790 [2024-11-18 19:05:51.311603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 [2024-11-18 19:05:51.311675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (88) qid:0 cid:5 nsid:88888888 cdw10:88888888 cdw11:88888888
00:07:32.790 [2024-11-18 19:05:51.311689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.790 #70 NEW cov: 11805 ft: 14643 corp: 34/3786b lim: 320 exec/s: 70 rss: 70Mb L: 162/202 MS: 1 CMP- DE: "\377\376"-
00:07:32.790 [2024-11-18 19:05:51.351741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:01010101 cdw11:01010101 SGL TRANSPORT DATA BLOCK TRANSPORT 0x101010101010101
00:07:32.790 [2024-11-18 19:05:51.351765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:32.790 [2024-11-18 19:05:51.351817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:1010101 cdw10:ffff0101 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:32.790 [2024-11-18 19:05:51.351831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:32.790 NEW_FUNC[1/1]: 0x16c4058 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143
00:07:32.790 #71 NEW cov: 11819 ft: 14978 corp: 35/3969b lim: 320 exec/s: 71 rss: 70Mb L: 183/202 MS: 1 InsertRepeatedBytes-
00:07:33.049 [2024-11-18 19:05:51.401785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.401810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 [2024-11-18 19:05:51.401870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.401883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.049 #72 NEW cov: 11819 ft: 14987 corp: 36/4157b lim: 320 exec/s: 72 rss: 70Mb L: 188/202 MS: 1 CrossOver-
00:07:33.049 [2024-11-18 19:05:51.441870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.441895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 [2024-11-18 19:05:51.441971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.441985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.049 #73 NEW cov: 11819 ft: 15003 corp: 37/4306b lim: 320 exec/s: 73 rss: 70Mb L: 149/202 MS: 1 PersAutoDict- DE: "\377\376"-
00:07:33.049 [2024-11-18 19:05:51.481890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.481915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 #74 NEW cov: 11819 ft: 15012 corp: 38/4414b lim: 320 exec/s: 74 rss: 70Mb L: 108/202 MS: 1 ChangeByte-
00:07:33.049 [2024-11-18 19:05:51.522045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.522077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 #75 NEW cov: 11819 ft: 15035 corp: 39/4504b lim: 320 exec/s: 75 rss: 70Mb L: 90/202 MS: 1 CMP- DE: "\001\000\002\000"-
00:07:33.049 [2024-11-18 19:05:51.562217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.562242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 [2024-11-18 19:05:51.562318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.562333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:33.049 #76 NEW cov: 11819 ft: 15042 corp: 40/4694b lim: 320 exec/s: 76 rss: 70Mb L: 190/202 MS: 1 InsertRepeatedBytes-
00:07:33.049 [2024-11-18 19:05:51.602235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.602260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.049 #77 NEW cov: 11819 ft: 15053 corp: 41/4815b lim: 320 exec/s: 77 rss: 70Mb L: 121/202 MS: 1 ChangeBinInt-
00:07:33.049 [2024-11-18 19:05:51.642383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff5dffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.049 [2024-11-18 19:05:51.642408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.308 #78 NEW cov: 11819 ft: 15067 corp: 42/4936b lim: 320 exec/s: 78 rss: 70Mb L: 121/202 MS: 1 ShuffleBytes-
00:07:33.308 [2024-11-18 19:05:51.682450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.308 [2024-11-18 19:05:51.682475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.308 #79 NEW cov: 11819 ft: 15071 corp: 43/5057b lim: 320 exec/s: 79 rss: 70Mb L: 121/202 MS: 1 InsertByte-
00:07:33.308 [2024-11-18 19:05:51.722559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.308 [2024-11-18 19:05:51.722584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.308 #80 NEW cov: 11819 ft: 15077 corp: 44/5142b lim: 320 exec/s: 80 rss: 70Mb L: 85/202 MS: 1 ChangeBinInt-
00:07:33.308 [2024-11-18 19:05:51.762686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff
00:07:33.308 [2024-11-18 19:05:51.762711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.308 #81 NEW cov: 11819 ft: 15085 corp: 45/5227b lim: 320 exec/s: 40 rss: 70Mb L: 85/202 MS: 1 ChangeBinInt-
00:07:33.308 #81 DONE cov: 11819 ft: 15085 corp: 45/5227b lim: 320 exec/s: 40 rss: 70Mb
00:07:33.308 ###### Recommended dictionary. ######
00:07:33.308 "\377\376" # Uses: 1
00:07:33.308 "\001\000\002\000" # Uses: 0
00:07:33.308 ###### End of recommended dictionary. ######
00:07:33.308 Done 81 runs in 2 second(s)
00:07:33.308 19:05:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf
00:07:33.308 19:05:51 -- ../common.sh@72 -- # (( i++ ))
00:07:33.308 19:05:51 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:33.308 19:05:51 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:07:33.308 19:05:51 -- nvmf/run.sh@23 -- # local fuzzer_type=1
00:07:33.308 19:05:51 -- nvmf/run.sh@24 -- # local timen=1
00:07:33.308 19:05:51 -- nvmf/run.sh@25 -- # local core=0x1
00:07:33.308 19:05:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:33.308 19:05:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf
00:07:33.308 19:05:51 -- nvmf/run.sh@29 -- # printf %02d 1
00:07:33.568 19:05:51 -- nvmf/run.sh@29 -- # port=4401
00:07:33.568 19:05:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:33.568 19:05:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401'
00:07:33.568 19:05:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:33.568 19:05:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock
00:07:33.568 [2024-11-18 19:05:51.945903] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:33.568 [2024-11-18 19:05:51.945998] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1298797 ]
00:07:33.827 EAL: No free 2048 kB hugepages reported on node 1
00:07:33.827 [2024-11-18 19:05:52.202536] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:33.827 [2024-11-18 19:05:52.287813] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:33.827 [2024-11-18 19:05:52.287939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:33.827 [2024-11-18 19:05:52.346221] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:33.827 [2024-11-18 19:05:52.362555] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 ***
00:07:33.827 INFO: Running with entropic power schedule (0xFF, 100).
00:07:33.827 INFO: Seed: 1057858919
00:07:33.827 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:33.827 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:33.827 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1
00:07:33.827 INFO: A corpus is not provided, starting from an empty corpus
00:07:33.827 #2 INITED exec/s: 0 rss: 60Mb
00:07:33.827 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:33.827 This may also happen if the target rejected all inputs we tried so far
00:07:33.827 [2024-11-18 19:05:52.410781] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:33.827 [2024-11-18 19:05:52.411101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.827 [2024-11-18 19:05:52.411130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:33.827 [2024-11-18 19:05:52.411187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:33.827 [2024-11-18 19:05:52.411201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 NEW_FUNC[1/671]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67
00:07:34.347 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:34.347 #9 NEW cov: 11655 ft: 11656 corp: 2/17b lim: 30 exec/s: 0 rss: 68Mb L: 16/16 MS: 2 CrossOver-InsertRepeatedBytes-
00:07:34.347 [2024-11-18 19:05:52.711572] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:34.347 [2024-11-18 19:05:52.711726] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.347 [2024-11-18 19:05:52.711833] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.347 [2024-11-18 19:05:52.712135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.712167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.712221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.712235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.712286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.712299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.712351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.712365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.347 #15 NEW cov: 11774 ft: 12584 corp: 3/44b lim: 30 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 InsertRepeatedBytes-
00:07:34.347 [2024-11-18 19:05:52.761672] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:34.347 [2024-11-18 19:05:52.761783] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.347 [2024-11-18 19:05:52.761886] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.347 [2024-11-18 19:05:52.761990] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf3
00:07:34.347 [2024-11-18 19:05:52.762197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.762223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.762279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.762293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.762344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.762357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.762410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7f5583a8 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.762423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.347 #21 NEW cov: 11780 ft: 12746 corp: 4/71b lim: 30 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CMP- DE: "\001\000\177U\250\013{\363"-
00:07:34.347 [2024-11-18 19:05:52.801688] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff
00:07:34.347 [2024-11-18 19:05:52.801797] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.347 [2024-11-18 19:05:52.802000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.802031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.802086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.802100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 #22 NEW cov: 11865 ft: 13109 corp: 5/88b lim: 30 exec/s: 0 rss: 69Mb L: 17/27 MS: 1 CrossOver-
00:07:34.347 [2024-11-18 19:05:52.841874] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.842004] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.842111] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.842215] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.842434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.842461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.842514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.842528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.842582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.842596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.842649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.842663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.347 #27 NEW cov: 11865 ft: 13216 corp: 6/117b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 5 CrossOver-CrossOver-InsertByte-EraseBytes-InsertRepeatedBytes-
00:07:34.347 [2024-11-18 19:05:52.881976] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.882105] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.882212] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280652) > buf size (4096)
00:07:34.347 [2024-11-18 19:05:52.882316] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.882526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.882554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.882606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.882620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.882673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.882686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.882742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.882756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.347 #28 NEW cov: 11865 ft: 13253 corp: 7/146b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CMP- DE: "\015\000\000\000"-
00:07:34.347 [2024-11-18 19:05:52.922092] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.922222] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.922328] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.922431] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.347 [2024-11-18 19:05:52.922649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.922675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.922737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.922751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.922803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.922817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.347 [2024-11-18 19:05:52.922868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7e120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.347 [2024-11-18 19:05:52.922881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.347 #29 NEW cov: 11865 ft: 13326 corp: 8/175b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeByte-
00:07:34.607 [2024-11-18 19:05:52.962229] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:34.607 [2024-11-18 19:05:52.962339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.607 [2024-11-18 19:05:52.962461] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.607 [2024-11-18 19:05:52.962766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:b8000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:52.962791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:52.962835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:52.962849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:52.962901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:52.962915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:52.962968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:52.962982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.607 #30 NEW cov: 11865 ft: 13409 corp: 9/202b lim: 30 exec/s: 0 rss: 69Mb L: 27/29 MS: 1 ChangeByte-
00:07:34.607 [2024-11-18 19:05:53.002297] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:34.607 [2024-11-18 19:05:53.002427] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.607 [2024-11-18 19:05:53.002534] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.607 [2024-11-18 19:05:53.002644] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf3
00:07:34.607 [2024-11-18 19:05:53.002851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.002877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.002931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.002945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.002975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.002989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.003039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7f5583a8 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.003053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.607 #31 NEW cov: 11865 ft: 13504 corp: 10/229b lim: 30 exec/s: 0 rss: 69Mb L: 27/29 MS: 1 CopyPart-
00:07:34.607 [2024-11-18 19:05:53.042401] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.607 [2024-11-18 19:05:53.042531] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.607 [2024-11-18 19:05:53.042642] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.607 [2024-11-18 19:05:53.042755] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.607 [2024-11-18 19:05:53.042962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.042987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.043041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.043055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.043106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120202 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.607 [2024-11-18 19:05:53.043119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.607 [2024-11-18 19:05:53.043170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7e120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.043183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.608 #32 NEW cov: 11865 ft: 13528 corp: 11/258b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ChangeBit-
00:07:34.608 [2024-11-18 19:05:53.082540] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.608 [2024-11-18 19:05:53.082676] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.608 [2024-11-18 19:05:53.082791] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280652) > buf size (4096)
00:07:34.608 [2024-11-18 19:05:53.082892] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.608 [2024-11-18 19:05:53.083094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.083119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.083172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.083186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.083238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.083252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.083303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:12120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.083317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.608 #33 NEW cov: 11865 ft: 13553 corp: 12/287b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 ShuffleBytes-
00:07:34.608 [2024-11-18 19:05:53.122666] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.608 [2024-11-18 19:05:53.122804] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.608 [2024-11-18 19:05:53.122910] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000007f
00:07:34.608 [2024-11-18 19:05:53.123009] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f312
00:07:34.608 [2024-11-18 19:05:53.123224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.123249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.123304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.123317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.123370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.123382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.123434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:55a8830b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.123447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.608 #34 NEW cov: 11865 ft: 13595 corp: 13/316b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 PersAutoDict- DE: "\001\000\177U\250\013{\363"-
00:07:34.608 [2024-11-18 19:05:53.162700] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:34.608 [2024-11-18 19:05:53.163009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.163038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.163093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.163108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.608 #35 NEW cov: 11865 ft: 13697 corp: 14/332b lim: 30 exec/s: 0 rss: 69Mb L: 16/29 MS: 1 CopyPart-
00:07:34.608 [2024-11-18 19:05:53.202890] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff
00:07:34.608 [2024-11-18 19:05:53.203005] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.608 [2024-11-18 19:05:53.203111] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000001
00:07:34.608 [2024-11-18 19:05:53.203214] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb7b
00:07:34.608 [2024-11-18 19:05:53.203427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a2e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.203453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.203507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.203521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.203592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.203606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.608 [2024-11-18 19:05:53.203657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:007f0055 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.608 [2024-11-18 19:05:53.203670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.868 #41 NEW cov: 11865 ft: 13795 corp: 15/360b lim: 30 exec/s: 0 rss: 69Mb L: 28/29 MS: 1 InsertByte-
00:07:34.868 [2024-11-18 19:05:53.242916] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (788488) > buf size (4096)
00:07:34.868 [2024-11-18 19:05:53.243139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:02018300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.868 [2024-11-18 19:05:53.243164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.868 #43 NEW cov: 11865 ft: 14212 corp: 16/369b lim: 30 exec/s: 0 rss: 69Mb L: 9/29 MS: 2 ChangeBit-PersAutoDict- DE: "\001\000\177U\250\013{\363"-
00:07:34.868 [2024-11-18 19:05:53.283125] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:34.868 [2024-11-18 19:05:53.283240] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.868 [2024-11-18 19:05:53.283347] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.868 [2024-11-18 19:05:53.283450] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf3
00:07:34.868 [2024-11-18 19:05:53.283667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.868 [2024-11-18 19:05:53.283694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.868 [2024-11-18 19:05:53.283746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.868 [2024-11-18 19:05:53.283768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.868 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:34.868 #44 NEW cov: 11888 ft: 14258 corp: 17/396b lim: 30 exec/s: 0 rss: 70Mb L: 27/27 MS: 1 PersAutoDict- DE: "\015\000\000\000"-
00:07:34.868 [2024-11-18 19:05:53.323222] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:34.868 [2024-11-18 19:05:53.323337] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.868 [2024-11-18 19:05:53.323446] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.868 [2024-11-18 19:05:53.323546] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x7bf3
00:07:34.868 [2024-11-18 19:05:53.323771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.868 [2024-11-18 19:05:53.323798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.868 [2024-11-18 19:05:53.323854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.868 [2024-11-18 19:05:53.323868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.323920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.323933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.323985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:1b000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.323999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.869 #45 NEW cov: 11888 ft: 14269 corp: 18/423b lim: 30 exec/s: 0 rss: 70Mb L: 27/27 MS: 1 ChangeBinInt-
00:07:34.869 [2024-11-18 19:05:53.363370] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.869 [2024-11-18 19:05:53.363484] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:34.869 [2024-11-18 19:05:53.363597] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000007f
00:07:34.869 [2024-11-18 19:05:53.363697] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f312
00:07:34.869 [2024-11-18 19:05:53.363904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.363930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.363982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:121202e7 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.363996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.364050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.364063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.364114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:55a8830b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.364127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.869 #46 NEW cov: 11888 ft: 14333 corp: 19/452b lim: 30 exec/s: 0 rss: 70Mb L: 29/29 MS: 1 ChangeBinInt-
00:07:34.869 [2024-11-18 19:05:53.403496] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272752) > buf size (4096)
00:07:34.869 [2024-11-18 19:05:53.403637] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.869 [2024-11-18 19:05:53.403754] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000001
00:07:34.869 [2024-11-18 19:05:53.403857] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x7b
00:07:34.869 [2024-11-18 19:05:53.404061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a5b8100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.404087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.404141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.404155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.404206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.404220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.404271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:001b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.404284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:34.869 #47 NEW cov: 11888 ft: 14362 corp: 20/480b lim: 30 exec/s: 47 rss: 70Mb L: 28/29 MS: 1 InsertByte-
00:07:34.869 [2024-11-18 19:05:53.453614] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:34.869 [2024-11-18 19:05:53.453729] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:34.869 [2024-11-18 19:05:53.453837] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096)
00:07:34.869 [2024-11-18 19:05:53.453941] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000307b
00:07:34.869 [2024-11-18 19:05:53.454140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.454166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.454220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.454234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.454288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.454307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:34.869 [2024-11-18 19:05:53.454360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:7f5583a8 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:34.869 [2024-11-18 19:05:53.454373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:35.130 #48 NEW cov: 11888 ft: 14385 corp: 21/508b lim: 30 exec/s: 48 rss: 70Mb L: 28/29 MS: 1 InsertByte-
00:07:35.130 [2024-11-18 19:05:53.493655] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:35.130 [2024-11-18 19:05:53.493782] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (13316) > buf size (4096)
00:07:35.130 [2024-11-18 19:05:53.494081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.130 [2024-11-18 19:05:53.494107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.130 [2024-11-18 19:05:53.494161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.130 [2024-11-18 19:05:53.494175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.130 [2024-11-18 19:05:53.494227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.130 [2024-11-18 19:05:53.494241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.130 #49 NEW cov: 11888 ft: 14680 corp: 22/528b lim: 30 exec/s: 49 rss: 70Mb L: 20/29 MS: 1 CMP- DE: "\015\000\000\000"-
00:07:35.130 [2024-11-18 19:05:53.533800] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff
00:07:35.130 [2024-11-18 19:05:53.533930] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:35.130 [2024-11-18 19:05:53.534031] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000001
00:07:35.131 [2024-11-18 19:05:53.534135] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb7b
00:07:35.131 [2024-11-18 19:05:53.534337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.534363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.534417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:40ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.534431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.534484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.534497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.534553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:007f0055 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.534567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:35.131 #50 NEW cov: 11888 ft: 14698 corp: 23/556b lim: 30 exec/s: 50 rss: 70Mb L: 28/29 MS: 1 InsertByte-
00:07:35.131 [2024-11-18 19:05:53.573827] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a80b
00:07:35.131 [2024-11-18 19:05:53.574083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0100817f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.574109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 #51 NEW cov: 11888 ft: 14724 corp: 24/565b lim: 30 exec/s: 51 rss: 70Mb L: 9/29 MS: 1 PersAutoDict- DE: "\001\000\177U\250\013{\363"-
00:07:35.131 [2024-11-18 19:05:53.604031] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:35.131 [2024-11-18 19:05:53.604142] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff
00:07:35.131 [2024-11-18 19:05:53.604247] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xfeff
00:07:35.131 [2024-11-18 19:05:53.604347] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000084f3
00:07:35.131 [2024-11-18 19:05:53.604555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.604581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.604635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.604649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.604700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.604713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.604766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:e4ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.604779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:35.131 #52 NEW cov: 11888 ft: 14768 corp: 25/592b lim: 30 exec/s: 52 rss: 70Mb L: 27/27 MS: 1 ChangeBinInt-
00:07:35.131 [2024-11-18 19:05:53.644128] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.131 [2024-11-18 19:05:53.644240] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.131 [2024-11-18 19:05:53.644347] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280652) > buf size (4096)
00:07:35.131 [2024-11-18 19:05:53.644450] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.131 [2024-11-18 19:05:53.644679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.644705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.644758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.644772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.644825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.644839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.644891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.644908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:35.131 #53 NEW cov: 11888 ft: 14805 corp: 26/621b lim: 30 exec/s: 53 rss: 70Mb L: 29/29 MS: 1 ChangeBinInt-
00:07:35.131 [2024-11-18 19:05:53.684257] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.131 [2024-11-18 19:05:53.684367] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.131 [2024-11-18 19:05:53.684472] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000007f
00:07:35.131 [2024-11-18 19:05:53.684580] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f312
00:07:35.131 [2024-11-18 19:05:53.684785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.684811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.684866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.684880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.684932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.684945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.684997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:55a8830b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.685010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:35.131 #54 NEW cov: 11888 ft: 14858 corp: 27/650b lim: 30 exec/s: 54 rss: 70Mb L: 29/29 MS: 1 ChangeByte-
00:07:35.131 [2024-11-18 19:05:53.724312] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096)
00:07:35.131 [2024-11-18 19:05:53.724443] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096)
00:07:35.131 [2024-11-18 19:05:53.724760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a00000d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.724786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.724840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.724854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.131 [2024-11-18 19:05:53.724905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0100001b cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.131 [2024-11-18 19:05:53.724918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.391 #55 NEW cov: 11888 ft: 14887 corp: 28/673b lim: 30 exec/s: 55 rss: 70Mb L: 23/29 MS: 1 EraseBytes-
00:07:35.391 [2024-11-18 19:05:53.764479] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.391 [2024-11-18 19:05:53.764617] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212
00:07:35.391 [2024-11-18 19:05:53.764727] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000007f
00:07:35.391 [2024-11-18 19:05:53.764832] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000f312
00:07:35.391 [2024-11-18 19:05:53.765050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.391 [2024-11-18 19:05:53.765080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:35.391 [2024-11-18 19:05:53.765135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.391 [2024-11-18 19:05:53.765149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:35.391 [2024-11-18 19:05:53.765202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12600212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.391 [2024-11-18 19:05:53.765215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:35.391 [2024-11-18 19:05:53.765269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:55a8830b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:35.391 [2024-11-18 19:05:53.765283] nvme_qpair.c:
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.391 #56 NEW cov: 11888 ft: 14895 corp: 29/702b lim: 30 exec/s: 56 rss: 70Mb L: 29/29 MS: 1 ChangeByte- 00:07:35.391 [2024-11-18 19:05:53.804524] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:35.391 [2024-11-18 19:05:53.804749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.391 [2024-11-18 19:05:53.804775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.391 #57 NEW cov: 11888 ft: 14907 corp: 30/711b lim: 30 exec/s: 57 rss: 70Mb L: 9/29 MS: 1 EraseBytes- 00:07:35.391 [2024-11-18 19:05:53.844651] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (788488) > buf size (4096) 00:07:35.391 [2024-11-18 19:05:53.844858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:02018300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.391 [2024-11-18 19:05:53.844883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.391 #58 NEW cov: 11888 ft: 14945 corp: 31/720b lim: 30 exec/s: 58 rss: 70Mb L: 9/29 MS: 1 ChangeByte- 00:07:35.391 [2024-11-18 19:05:53.884788] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (788488) > buf size (4096) 00:07:35.391 [2024-11-18 19:05:53.885001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:02018300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.391 [2024-11-18 19:05:53.885026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.391 #59 NEW cov: 11888 ft: 14958 corp: 32/729b lim: 30 exec/s: 59 rss: 70Mb L: 9/29 MS: 1 CrossOver- 00:07:35.391 [2024-11-18 19:05:53.924937] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:35.391 [2024-11-18 19:05:53.925336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.391 [2024-11-18 19:05:53.925361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.391 [2024-11-18 19:05:53.925416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000d0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.925430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.392 [2024-11-18 19:05:53.925482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.925500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.392 #60 NEW cov: 11888 ft: 15015 corp: 33/750b lim: 30 exec/s: 60 rss: 70Mb L: 21/29 MS: 1 InsertByte- 00:07:35.392 [2024-11-18 19:05:53.965083] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 
00:07:35.392 [2024-11-18 19:05:53.965212] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200006512 00:07:35.392 [2024-11-18 19:05:53.965319] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280652) > buf size (4096) 00:07:35.392 [2024-11-18 19:05:53.965426] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.392 [2024-11-18 19:05:53.965643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.965668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.392 [2024-11-18 19:05:53.965723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.965737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.392 [2024-11-18 19:05:53.965788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12128112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.965802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.392 [2024-11-18 19:05:53.965856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.392 [2024-11-18 19:05:53.965870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.652 #61 NEW cov: 11888 ft: 15024 corp: 34/779b lim: 30 exec/s: 61 rss: 70Mb L: 29/29 MS: 1 ChangeByte- 00:07:35.652 [2024-11-18 19:05:54.005190] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:35.652 [2024-11-18 19:05:54.005299] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.652 [2024-11-18 19:05:54.005404] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000017f 00:07:35.652 [2024-11-18 19:05:54.005506] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xb7b 00:07:35.652 [2024-11-18 19:05:54.005712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a2e0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.005738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.005794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.005808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.005859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff0083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.005873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.005925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ff550000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.005938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.652 #62 NEW cov: 11888 ft: 15036 corp: 35/807b lim: 30 exec/s: 62 rss: 70Mb L: 28/29 MS: 1 ShuffleBytes- 00:07:35.652 [2024-11-18 19:05:54.045313] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.652 [2024-11-18 19:05:54.045441] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.652 [2024-11-18 19:05:54.045546] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.652 [2024-11-18 19:05:54.045658] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (542796) > buf size (4096) 00:07:35.652 [2024-11-18 19:05:54.045866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.045893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.045949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.045963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.046016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:127e0212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.046030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.046083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.046096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.652 #63 NEW cov: 11888 ft: 15041 corp: 36/831b lim: 30 exec/s: 63 rss: 70Mb L: 24/29 MS: 1 EraseBytes- 00:07:35.652 [2024-11-18 19:05:54.085412] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:35.652 [2024-11-18 19:05:54.085520] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.652 [2024-11-18 19:05:54.085633] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x7f 00:07:35.652 [2024-11-18 19:05:54.085737] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (87716) > buf size (4096) 00:07:35.652 [2024-11-18 19:05:54.085951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.085977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.086031] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.086045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.652 [2024-11-18 19:05:54.086098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.652 [2024-11-18 19:05:54.086111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.086163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:55a80000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.086177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.653 #64 NEW cov: 11888 ft: 15049 corp: 37/858b lim: 30 exec/s: 64 rss: 70Mb L: 27/29 MS: 1 CrossOver- 00:07:35.653 [2024-11-18 19:05:54.125540] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272752) > buf size (4096) 00:07:35.653 [2024-11-18 19:05:54.125674] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272752) > buf size (4096) 00:07:35.653 [2024-11-18 19:05:54.125797] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.653 [2024-11-18 19:05:54.125900] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000001 00:07:35.653 [2024-11-18 19:05:54.126113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a5b8100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.126139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.126193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0a5b8100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.126207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.126260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.126273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.126324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.126337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.653 #65 NEW cov: 11888 ft: 15063 corp: 38/886b lim: 30 exec/s: 65 rss: 70Mb L: 28/29 MS: 1 CopyPart- 00:07:35.653 [2024-11-18 19:05:54.165684] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.653 [2024-11-18 19:05:54.165795] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200006512 00:07:35.653 [2024-11-18 
19:05:54.165898] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (542796) > buf size (4096) 00:07:35.653 [2024-11-18 19:05:54.166000] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.653 [2024-11-18 19:05:54.166216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.166242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.166296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.166309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.166363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.166376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.166430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.166443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.653 #66 NEW cov: 11888 ft: 15071 corp: 39/915b lim: 30 exec/s: 66 rss: 70Mb L: 29/29 MS: 1 ShuffleBytes- 00:07:35.653 [2024-11-18 19:05:54.205838] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:35.653 [2024-11-18 19:05:54.205969] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.653 [2024-11-18 19:05:54.206075] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:35.653 [2024-11-18 19:05:54.206179] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf3 00:07:35.653 [2024-11-18 19:05:54.206393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.206421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.206477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.206491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.206543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.206564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.206618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:7 nsid:0 cdw10:ffff83a8 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.206632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.653 #67 NEW cov: 11888 ft: 15094 corp: 40/942b lim: 30 exec/s: 67 rss: 70Mb L: 27/29 MS: 1 CopyPart- 00:07:35.653 [2024-11-18 19:05:54.245912] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:35.653 [2024-11-18 19:05:54.246041] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.653 [2024-11-18 19:05:54.246147] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:35.653 [2024-11-18 19:05:54.246253] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007bf3 00:07:35.653 [2024-11-18 19:05:54.246473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.246500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.246560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.246576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.246629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.246642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.653 [2024-11-18 19:05:54.246706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83a8 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.653 [2024-11-18 19:05:54.246719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.913 #68 NEW cov: 11888 ft: 15105 corp: 41/969b lim: 30 exec/s: 68 rss: 70Mb L: 27/29 MS: 1 ShuffleBytes- 00:07:35.913 [2024-11-18 19:05:54.286079] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.286210] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.286318] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.286422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.286633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.286659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.286730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 
19:05:54.286744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.286797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.286811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.286863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.286877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.914 #69 NEW cov: 11888 ft: 15114 corp: 42/998b lim: 30 exec/s: 69 rss: 70Mb L: 29/29 MS: 1 ShuffleBytes- 00:07:35.914 [2024-11-18 19:05:54.326159] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.326287] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.326394] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280668) > buf size (4096) 00:07:35.914 [2024-11-18 19:05:54.326498] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.326726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.326753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.326807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.326822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.326874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12168112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.326887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.326941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.326955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.366287] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.366417] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.366526] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (280668) > buf size (4096) 00:07:35.914 [2024-11-18 19:05:54.366635] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.366858] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.366883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.366938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.366955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.367009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12168112 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.367023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.367074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.367087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.914 #71 NEW cov: 11888 ft: 15117 corp: 43/1027b lim: 30 exec/s: 71 rss: 70Mb L: 29/29 MS: 2 ChangeBit-ShuffleBytes- 00:07:35.914 [2024-11-18 19:05:54.406393] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001212 00:07:35.914 [2024-11-18 19:05:54.406523] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200006512 00:07:35.914 [2024-11-18 19:05:54.406637] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (542796) > buf size (4096) 00:07:35.914 [2024-11-18 19:05:54.406752] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14412) > buf size (4096) 00:07:35.914 [2024-11-18 19:05:54.406953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.406979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.407035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:12120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.407048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.407102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:12120200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.407115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.914 [2024-11-18 19:05:54.407168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e120000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.914 [2024-11-18 19:05:54.407181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:35.914 #72 NEW cov: 11888 ft: 15127 corp: 44/1056b lim: 30 exec/s: 36 rss: 70Mb L: 29/29 MS: 1 ChangeBinInt- 00:07:35.914 #72 DONE cov: 11888 ft: 15127 corp: 44/1056b lim: 30 exec/s: 36 rss: 70Mb 00:07:35.914 ###### Recommended dictionary. ###### 00:07:35.914 "\001\000\177U\250\013{\363" # Uses: 3 00:07:35.914 "\015\000\000\000" # Uses: 1 00:07:35.914 ###### End of recommended dictionary. ###### 00:07:35.914 Done 72 runs in 2 second(s) 00:07:36.174 19:05:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:36.174 19:05:54 -- ../common.sh@72 -- # (( i++ )) 00:07:36.174 19:05:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.174 19:05:54 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:36.174 19:05:54 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:36.174 19:05:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.174 19:05:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.174 19:05:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:36.174 19:05:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:36.174 19:05:54 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:36.174 19:05:54 -- nvmf/run.sh@29 -- # port=4402 00:07:36.174 19:05:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:36.174 19:05:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:36.174 19:05:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.174 19:05:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:36.174 [2024-11-18 19:05:54.598612] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:36.174 [2024-11-18 19:05:54.598686] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1299302 ] 00:07:36.174 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.434 [2024-11-18 19:05:54.848253] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.434 [2024-11-18 19:05:54.939876] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.434 [2024-11-18 19:05:54.940001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.434 [2024-11-18 19:05:54.998445] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:36.434 [2024-11-18 19:05:55.014783] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:36.434 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:36.434 INFO: Seed: 3712883817 00:07:36.693 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:36.693 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:36.693 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:36.693 INFO: A corpus is not provided, starting from an empty corpus 00:07:36.693 #2 INITED exec/s: 0 rss: 61Mb 00:07:36.693 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:36.693 This may also happen if the target rejected all inputs we tried so far 00:07:36.693 [2024-11-18 19:05:55.092040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.693 [2024-11-18 19:05:55.092077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.693 [2024-11-18 19:05:55.092161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.693 [2024-11-18 19:05:55.092177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.693 [2024-11-18 19:05:55.092252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.693 [2024-11-18 19:05:55.092268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.953 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:36.953 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.953 #13 NEW cov: 11574 ft: 11571 corp: 2/28b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:36.953 [2024-11-18 19:05:55.422109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.422156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.953 [2024-11-18 19:05:55.422283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.422314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.953 [2024-11-18 19:05:55.422446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.422467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.953 #14 NEW cov: 11693 ft: 12270 corp: 3/55b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CopyPart- 00:07:36.953 [2024-11-18 19:05:55.481603] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.953 [2024-11-18 
19:05:55.481970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.481999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.953 [2024-11-18 19:05:55.482127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.482150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.953 #22 NEW cov: 11708 ft: 12619 corp: 4/74b lim: 35 exec/s: 0 rss: 68Mb L: 19/27 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:36.953 [2024-11-18 19:05:55.521591] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:36.953 [2024-11-18 19:05:55.521972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.522001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.953 [2024-11-18 19:05:55.522281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.953 [2024-11-18 19:05:55.522306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.953 #23 NEW cov: 11793 ft: 13308 corp: 5/95b lim: 35 exec/s: 0 rss: 68Mb L: 21/27 MS: 1 CMP- DE: "\001\000"- 00:07:37.213 [2024-11-18 19:05:55.572483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.572514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.572636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.572653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.572776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:3f00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.572792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.213 #24 NEW cov: 11793 ft: 13389 corp: 6/122b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeByte- 00:07:37.213 [2024-11-18 19:05:55.611823] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:37.213 [2024-11-18 19:05:55.612160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.612186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.612449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.612471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.213 #25 NEW cov: 11793 ft: 13489 corp: 7/144b lim: 35 exec/s: 0 rss: 68Mb L: 22/27 MS: 1 InsertByte- 00:07:37.213 [2024-11-18 19:05:55.662957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.662983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.663106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.663124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.663242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.663260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.663384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.663400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.213 #26 NEW cov: 11793 ft: 14068 corp: 8/173b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:37.213 [2024-11-18 19:05:55.702886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.702914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.703032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.703062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.703183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.703200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.213 #27 NEW cov: 11793 ft: 14249 corp: 9/200b lim: 35 exec/s: 0 rss: 69Mb L: 27/29 MS: 1 ChangeBit- 00:07:37.213 [2024-11-18 19:05:55.742703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 
19:05:55.742730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.742850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.742867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.213 #31 NEW cov: 11793 ft: 14280 corp: 10/215b lim: 35 exec/s: 0 rss: 69Mb L: 15/29 MS: 4 ChangeByte-ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:37.213 [2024-11-18 19:05:55.783232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.783259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.783390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.783407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.783519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.783537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.213 [2024-11-18 19:05:55.783657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.213 [2024-11-18 19:05:55.783674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.213 #32 NEW cov: 11793 ft: 14374 corp: 11/244b lim: 35 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:37.473 [2024-11-18 19:05:55.843456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:55.843484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:55.843615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:55.843633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:55.843764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0070 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:55.843782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:55.843909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 
[2024-11-18 19:05:55.843926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.473 #33 NEW cov: 11793 ft: 14482 corp: 12/274b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 InsertByte- 00:07:37.473 [2024-11-18 19:05:55.902927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:55.902955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.473 #34 NEW cov: 11793 ft: 14817 corp: 13/283b lim: 35 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 EraseBytes- 00:07:37.473 [2024-11-18 19:05:55.953031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:55.953057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.473 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.473 #35 NEW cov: 11816 ft: 14876 corp: 14/292b lim: 35 exec/s: 0 rss: 69Mb L: 9/30 MS: 1 ChangeBit- 00:07:37.473 [2024-11-18 19:05:56.003525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c0c0000a cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:56.003557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:56.003686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c0c000c0 cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:56.003707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:56.003835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c0c000c0 cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:56.003853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.473 #38 NEW cov: 11816 ft: 14887 corp: 15/317b lim: 35 exec/s: 0 rss: 69Mb L: 25/30 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:37.473 [2024-11-18 19:05:56.053916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:56.053945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:56.054070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:fe00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 19:05:56.054089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.473 [2024-11-18 19:05:56.054190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c36d0011 cdw11:0000b28b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.473 [2024-11-18 
19:05:56.054207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.733 #39 NEW cov: 11816 ft: 14928 corp: 16/344b lim: 35 exec/s: 39 rss: 69Mb L: 27/30 MS: 1 CMP- DE: "\376n\021\303m\262\213\000"- 00:07:37.733 [2024-11-18 19:05:56.103688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.733 [2024-11-18 19:05:56.103715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.733 [2024-11-18 19:05:56.103847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.733 [2024-11-18 19:05:56.103865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.733 [2024-11-18 19:05:56.103993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0034 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.733 [2024-11-18 19:05:56.104010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.733 #45 NEW cov: 11816 ft: 14953 corp: 17/371b lim: 35 exec/s: 45 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:37.733 [2024-11-18 19:05:56.144047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.733 [2024-11-18 19:05:56.144076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.733 [2024-11-18 19:05:56.144199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:fe00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.733 [2024-11-18 19:05:56.144216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.144338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c36d0081 cdw11:0000b28b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.144358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.734 #46 NEW cov: 11816 ft: 14993 corp: 18/398b lim: 35 exec/s: 46 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:37.734 [2024-11-18 19:05:56.194236] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:37.734 [2024-11-18 19:05:56.194623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.194651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.194769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dfdf00df cdw11:df00dfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.194786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.194917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:df0000df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.194932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.195062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.195086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.734 #52 NEW cov: 11816 ft: 15091 corp: 19/427b lim: 35 exec/s: 52 rss: 69Mb L: 29/30 MS: 1 InsertRepeatedBytes- 00:07:37.734 [2024-11-18 19:05:56.244609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.244635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.244776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.244793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.244926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.244941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.245061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.245077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.734 #53 NEW cov: 11816 ft: 15166 corp: 20/455b lim: 35 exec/s: 53 rss: 69Mb L: 28/30 MS: 1 CrossOver- 00:07:37.734 [2024-11-18 19:05:56.304632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.304661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.304787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.304807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.734 [2024-11-18 19:05:56.304939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:2500ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.734 [2024-11-18 19:05:56.304957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:37.734 #54 NEW cov: 11816 ft: 15214 corp: 21/482b lim: 35 exec/s: 54 rss: 69Mb L: 27/30 MS: 1 ChangeByte- 00:07:37.993 [2024-11-18 19:05:56.354810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.354838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.993 [2024-11-18 19:05:56.354956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:fe00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.354975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.993 [2024-11-18 19:05:56.355097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c36d0011 cdw11:0000b28b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.355113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.993 #55 NEW cov: 11816 ft: 15222 corp: 22/509b lim: 35 exec/s: 55 rss: 69Mb L: 27/30 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:37.993 [2024-11-18 19:05:56.405180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.405208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.993 [2024-11-18 19:05:56.405348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:1100fe6e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.405367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.993 [2024-11-18 19:05:56.405496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b28b006d cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.993 [2024-11-18 19:05:56.405513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.405639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.405656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.994 #56 NEW cov: 11816 ft: 15281 corp: 23/538b lim: 35 exec/s: 56 rss: 69Mb L: 29/30 MS: 1 PersAutoDict- DE: "\376n\021\303m\262\213\000"- 00:07:37.994 [2024-11-18 19:05:56.455412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.455441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.455569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 
[2024-11-18 19:05:56.455588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.455710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.455728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.455854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.455874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.994 #57 NEW cov: 11816 ft: 15321 corp: 24/569b lim: 35 exec/s: 57 rss: 69Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:07:37.994 [2024-11-18 19:05:56.505354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.505383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.505507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.505526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.505653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0034 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.505671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.994 #58 NEW cov: 11816 ft: 15367 corp: 25/596b lim: 35 exec/s: 58 rss: 70Mb L: 27/31 MS: 1 CopyPart- 00:07:37.994 [2024-11-18 19:05:56.555785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.555812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.555943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.555962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.556092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 19:05:56.556110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.994 [2024-11-18 19:05:56.556238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ea00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.994 [2024-11-18 
19:05:56.556257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.994 #59 NEW cov: 11816 ft: 15377 corp: 26/628b lim: 35 exec/s: 59 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:38.254 [2024-11-18 19:05:56.605646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:c0c0000a cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.605674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.605809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c0c000c0 cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.605827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.605954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c0c000c0 cdw11:c000c0c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.605970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.254 #60 NEW cov: 11816 ft: 15397 corp: 27/653b lim: 35 exec/s: 60 rss: 70Mb L: 25/32 MS: 1 ChangeByte- 00:07:38.254 [2024-11-18 19:05:56.655160] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.254 [2024-11-18 19:05:56.655504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b009000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.655535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.655655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.655691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.254 #61 NEW cov: 11816 ft: 15417 corp: 28/672b lim: 35 exec/s: 61 rss: 70Mb L: 19/32 MS: 1 ChangeBinInt- 00:07:38.254 [2024-11-18 19:05:56.705613] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.254 [2024-11-18 19:05:56.705984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.706013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.706150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6e1100fe cdw11:b200c36d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.706167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.706304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:2500ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 
[2024-11-18 19:05:56.706329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.254 #62 NEW cov: 11816 ft: 15438 corp: 29/699b lim: 35 exec/s: 62 rss: 70Mb L: 27/32 MS: 1 PersAutoDict- DE: "\376n\021\303m\262\213\000"- 00:07:38.254 [2024-11-18 19:05:56.765950] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.254 [2024-11-18 19:05:56.766467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:dfdf00df cdw11:df00dfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.766498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.766620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:dfdf00df cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.766637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.766769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.766793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.766918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:d3d300d3 cdw11:d300d3d3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.766935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.254 #63 NEW cov: 11816 ft: 15494 corp: 30/730b lim: 35 exec/s: 63 rss: 70Mb L: 31/32 MS: 1 CrossOver- 00:07:38.254 [2024-11-18 19:05:56.826345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff002c cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.826373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.826498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.826519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.254 [2024-11-18 19:05:56.826636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.254 [2024-11-18 19:05:56.826654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.254 #64 NEW cov: 11816 ft: 15499 corp: 31/755b lim: 35 exec/s: 64 rss: 70Mb L: 25/32 MS: 1 InsertRepeatedBytes- 00:07:38.513 [2024-11-18 19:05:56.876704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:00003a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.513 [2024-11-18 19:05:56.876732] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.513 [2024-11-18 19:05:56.876858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.513 [2024-11-18 19:05:56.876877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.513 [2024-11-18 19:05:56.876999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:70ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.513 [2024-11-18 19:05:56.877019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.513 [2024-11-18 19:05:56.877142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.513 [2024-11-18 19:05:56.877161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.514 #65 NEW cov: 11816 ft: 15500 corp: 32/786b lim: 35 exec/s: 65 rss: 70Mb L: 31/32 MS: 1 InsertByte- 00:07:38.514 [2024-11-18 19:05:56.936418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ff34 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.514 [2024-11-18 19:05:56.936445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.514 [2024-11-18 19:05:56.936575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.514 [2024-11-18 19:05:56.936594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.514 #66 NEW cov: 11816 ft: 15522 corp: 33/803b lim: 35 exec/s: 66 rss: 70Mb L: 17/32 MS: 1 EraseBytes- 00:07:38.514 [2024-11-18 19:05:56.985883] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.514 [2024-11-18 19:05:56.986199] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.514 [2024-11-18 19:05:56.986577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.514 [2024-11-18 19:05:56.986604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.514 [2024-11-18 19:05:56.986735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f0000 cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.514 [2024-11-18 19:05:56.986755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.514 [2024-11-18 19:05:56.986891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7f7f007f cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.514 [2024-11-18 19:05:56.986908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.514 [2024-11-18 19:05:56.987040] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:38.514 [2024-11-18 19:05:56.987065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:38.514 #67 NEW cov: 11816 ft: 15570 corp: 34/834b lim: 35 exec/s: 67 rss: 70Mb L: 31/32 MS: 1 InsertRepeatedBytes-
00:07:38.514 [2024-11-18 19:05:57.036646] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0
00:07:38.514 [2024-11-18 19:05:57.037030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:38.514 [2024-11-18 19:05:57.037058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:38.514 [2024-11-18 19:05:57.037339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00df0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:38.514 [2024-11-18 19:05:57.037362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:38.514 #68 NEW cov: 11816 ft: 15575 corp: 35/860b lim: 35 exec/s: 34 rss: 70Mb L: 26/32 MS: 1 CrossOver-
00:07:38.514 #68 DONE cov: 11816 ft: 15575 corp: 35/860b lim: 35 exec/s: 34 rss: 70Mb
00:07:38.514 ###### Recommended dictionary. ######
00:07:38.514 "\001\000" # Uses: 4
00:07:38.514 "\376n\021\303m\262\213\000" # Uses: 2
00:07:38.514 ###### End of recommended dictionary. ######
00:07:38.514 Done 68 runs in 2 second(s)
00:07:38.773 19:05:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf
19:05:57 -- ../common.sh@72 -- # (( i++ ))
19:05:57 -- ../common.sh@72 -- # (( i < fuzz_num ))
19:05:57 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
19:05:57 -- nvmf/run.sh@23 -- # local fuzzer_type=3
19:05:57 -- nvmf/run.sh@24 -- # local timen=1
19:05:57 -- nvmf/run.sh@25 -- # local core=0x1
19:05:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
19:05:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf
19:05:57 -- nvmf/run.sh@29 -- # printf %02d 3
19:05:57 -- nvmf/run.sh@29 -- # port=4403
19:05:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
19:05:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403'
19:05:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
19:05:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock
00:07:38.773 [2024-11-18 19:05:57.229671] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:38.773 [2024-11-18 19:05:57.229735] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1299841 ]
00:07:38.773 EAL: No free 2048 kB hugepages reported on node 1
00:07:39.033 [2024-11-18 19:05:57.478981] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:39.033 [2024-11-18 19:05:57.568609] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:39.033 [2024-11-18 19:05:57.568745] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:39.033 [2024-11-18 19:05:57.626906] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:39.296 [2024-11-18 19:05:57.643261] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 ***
00:07:39.296 INFO: Running with entropic power schedule (0xFF, 100).
00:07:39.296 INFO: Seed: 2043886226
00:07:39.296 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:39.296 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:39.296 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:39.296 INFO: A corpus is not provided, starting from an empty corpus
00:07:39.296 #2 INITED exec/s: 0 rss: 60Mb
00:07:39.296 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:39.296 This may also happen if the target rejected all inputs we tried so far
00:07:39.296 [2024-11-18 19:05:57.692154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000
00:07:39.296 [2024-11-18 19:05:57.692184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.646
0 rss: 69Mb L: 20/20 MS: 1 InsertByte- 00:07:39.905 #27 NEW cov: 11958 ft: 14188 corp: 12/165b lim: 20 exec/s: 0 rss: 69Mb L: 18/20 MS: 1 CrossOver- 00:07:39.905 #28 NEW cov: 11958 ft: 14203 corp: 13/183b lim: 20 exec/s: 0 rss: 69Mb L: 18/20 MS: 1 ChangeByte- 00:07:39.905 #29 NEW cov: 11958 ft: 14237 corp: 14/203b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:39.905 #30 NEW cov: 11958 ft: 14261 corp: 15/210b lim: 20 exec/s: 0 rss: 70Mb L: 7/20 MS: 1 EraseBytes- 00:07:40.164 [2024-11-18 19:05:58.514269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.164 [2024-11-18 19:05:58.514306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.164 #31 NEW cov: 11958 ft: 14360 corp: 16/220b lim: 20 exec/s: 0 rss: 70Mb L: 10/20 MS: 1 CopyPart- 00:07:40.164 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:40.164 #32 NEW cov: 11981 ft: 14400 corp: 17/228b lim: 20 exec/s: 0 rss: 70Mb L: 8/20 MS: 1 InsertByte- 00:07:40.164 #33 NEW cov: 11981 ft: 14417 corp: 18/241b lim: 20 exec/s: 0 rss: 70Mb L: 13/20 MS: 1 EraseBytes- 00:07:40.164 #34 NEW cov: 11981 ft: 14425 corp: 19/254b lim: 20 exec/s: 0 rss: 70Mb L: 13/20 MS: 1 ChangeBit- 00:07:40.164 #35 NEW cov: 11981 ft: 14518 corp: 20/272b lim: 20 exec/s: 35 rss: 70Mb L: 18/20 MS: 1 ChangeBit- 00:07:40.164 #36 NEW cov: 11981 ft: 14584 corp: 21/291b lim: 20 exec/s: 36 rss: 70Mb L: 19/20 MS: 1 InsertByte- 00:07:40.423 #37 NEW cov: 11981 ft: 14627 corp: 22/309b lim: 20 exec/s: 37 rss: 70Mb L: 18/20 MS: 1 ShuffleBytes- 00:07:40.423 NEW_FUNC[1/2]: 0x1279a68 in nvmf_transport_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/transport.c:773 00:07:40.423 NEW_FUNC[2/2]: 0x129ab28 in nvmf_tcp_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3493 00:07:40.423 #38 NEW cov: 12038 ft: 14735 corp: 23/319b lim: 20 exec/s: 38 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:40.423 #39 NEW cov: 12038 ft: 14740 corp: 24/326b lim: 20 exec/s: 39 rss: 70Mb L: 7/20 MS: 1 CopyPart- 00:07:40.423 #40 NEW cov: 12038 ft: 14755 corp: 25/340b lim: 20 exec/s: 40 rss: 70Mb L: 14/20 MS: 1 ChangeBit- 00:07:40.423 [2024-11-18 19:05:58.915473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.423 [2024-11-18 19:05:58.915502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.423 #41 NEW cov: 12038 ft: 14765 corp: 26/350b lim: 20 exec/s: 41 rss: 70Mb L: 10/20 MS: 1 ShuffleBytes- 00:07:40.423 #42 NEW cov: 12038 ft: 14772 corp: 27/363b lim: 20 exec/s: 42 rss: 70Mb L: 13/20 MS: 1 ChangeBit- 00:07:40.423 #43 NEW cov: 12038 ft: 14816 corp: 28/378b lim: 20 exec/s: 43 rss: 70Mb L: 15/20 MS: 1 CrossOver- 00:07:40.682 #44 NEW cov: 12038 ft: 14826 corp: 29/383b lim: 20 exec/s: 44 rss: 70Mb L: 5/20 MS: 1 EraseBytes- 00:07:40.683 #45 NEW cov: 12038 ft: 14846 corp: 30/397b lim: 20 exec/s: 45 rss: 70Mb L: 14/20 MS: 1 CrossOver- 00:07:40.683 #46 NEW cov: 12038 ft: 14855 corp: 31/407b lim: 20 exec/s: 46 rss: 70Mb L: 10/20 MS: 1 PersAutoDict- DE: "2 \0008\005\177\000\000"- 00:07:40.683 #52 NEW cov: 12038 ft: 14881 corp: 32/427b lim: 20 exec/s: 52 rss: 70Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:40.683 #53 NEW cov: 12038 
ft: 14892 corp: 33/442b lim: 20 exec/s: 53 rss: 70Mb L: 15/20 MS: 1 InsertByte-
00:07:40.683 #54 NEW cov: 12038 ft: 14905 corp: 34/460b lim: 20 exec/s: 54 rss: 70Mb L: 18/20 MS: 1 CrossOver-
00:07:40.941 #55 NEW cov: 12038 ft: 14950 corp: 35/478b lim: 20 exec/s: 55 rss: 70Mb L: 18/20 MS: 1 CrossOver-
00:07:40.941 #56 NEW cov: 12038 ft: 14958 corp: 36/491b lim: 20 exec/s: 56 rss: 70Mb L: 13/20 MS: 1 ChangeBit-
00:07:40.941 #57 NEW cov: 12038 ft: 14966 corp: 37/509b lim: 20 exec/s: 57 rss: 70Mb L: 18/20 MS: 1 PersAutoDict- DE: "2 \0008\005\177\000\000"-
00:07:40.941 #58 NEW cov: 12038 ft: 14971 corp: 38/528b lim: 20 exec/s: 58 rss: 70Mb L: 19/20 MS: 1 InsertByte-
00:07:40.941 #59 NEW cov: 12038 ft: 14981 corp: 39/547b lim: 20 exec/s: 59 rss: 70Mb L: 19/20 MS: 1 CopyPart-
00:07:40.941 #60 NEW cov: 12038 ft: 14984 corp: 40/566b lim: 20 exec/s: 60 rss: 70Mb L: 19/20 MS: 1 ChangeBit-
00:07:40.941 #61 NEW cov: 12038 ft: 14985 corp: 41/571b lim: 20 exec/s: 61 rss: 70Mb L: 5/20 MS: 1 ShuffleBytes-
00:07:41.201 #64 NEW cov: 12038 ft: 15013 corp: 42/586b lim: 20 exec/s: 64 rss: 70Mb L: 15/20 MS: 3 EraseBytes-ChangeByte-CrossOver-
00:07:41.201 #65 NEW cov: 12038 ft: 15025 corp: 43/602b lim: 20 exec/s: 65 rss: 70Mb L: 16/20 MS: 1 InsertByte-
00:07:41.201 #66 NEW cov: 12038 ft: 15044 corp: 44/620b lim: 20 exec/s: 66 rss: 70Mb L: 18/20 MS: 1 PersAutoDict- DE: "2 \0008\005\177\000\000"-
00:07:41.201 #67 NEW cov: 12038 ft: 15076 corp: 45/637b lim: 20 exec/s: 33 rss: 70Mb L: 17/20 MS: 1 EraseBytes-
00:07:41.201 #67 DONE cov: 12038 ft: 15076 corp: 45/637b lim: 20 exec/s: 33 rss: 70Mb
00:07:41.201 ###### Recommended dictionary. ######
00:07:41.201 "2 \0008\005\177\000\000" # Uses: 3
00:07:41.201 ###### End of recommended dictionary. ######
00:07:41.201 Done 67 runs in 2 second(s)
00:07:41.461 19:05:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf
19:05:59 -- ../common.sh@72 -- # (( i++ ))
19:05:59 -- ../common.sh@72 -- # (( i < fuzz_num ))
19:05:59 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1
19:05:59 -- nvmf/run.sh@23 -- # local fuzzer_type=4
19:05:59 -- nvmf/run.sh@24 -- # local timen=1
19:05:59 -- nvmf/run.sh@25 -- # local core=0x1
19:05:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
19:05:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf
19:05:59 -- nvmf/run.sh@29 -- # printf %02d 4
19:05:59 -- nvmf/run.sh@29 -- # port=4404
19:05:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
19:05:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404'
19:05:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
19:05:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock
00:07:41.461 [2024-11-18 19:05:59.859216] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:41.461 [2024-11-18 19:05:59.859289] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1300285 ]
00:07:41.461 EAL: No free 2048 kB hugepages reported on node 1
00:07:41.720 [2024-11-18 19:06:00.118456] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:41.720 [2024-11-18 19:06:00.196587] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:41.720 [2024-11-18 19:06:00.196741] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:41.720 [2024-11-18 19:06:00.255201] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:41.721 [2024-11-18 19:06:00.271541] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 ***
00:07:41.721 INFO: Running with entropic power schedule (0xFF, 100).
00:07:41.721 INFO: Seed: 377963459
00:07:41.721 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:41.721 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:41.721 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4
00:07:41.721 INFO: A corpus is not provided, starting from an empty corpus
00:07:41.721 #2 INITED exec/s: 0 rss: 61Mb
00:07:41.721 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:41.721 This may also happen if the target rejected all inputs we tried so far
00:07:41.980 [2024-11-18 19:06:00.338159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:41.980 [2024-11-18 19:06:00.338197] nvme_qpair.c:
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.239 #20 NEW cov: 11720 ft: 12487 corp: 4/34b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 CMP- DE: "\000\213\262pzu\012@"- 00:07:42.239 [2024-11-18 19:06:00.778798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:70b2008b cdw11:7a750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.239 [2024-11-18 19:06:00.778830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.239 #21 NEW cov: 11805 ft: 12785 corp: 5/44b lim: 35 exec/s: 0 rss: 68Mb L: 10/13 MS: 1 ShuffleBytes- 00:07:42.239 [2024-11-18 19:06:00.839214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52850a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.239 [2024-11-18 19:06:00.839241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.239 [2024-11-18 19:06:00.839370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.239 [2024-11-18 19:06:00.839385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.497 #22 NEW cov: 11805 ft: 13706 corp: 6/58b lim: 35 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertByte- 00:07:42.497 [2024-11-18 19:06:00.899356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52850a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.497 [2024-11-18 19:06:00.899385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.497 [2024-11-18 19:06:00.899517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:52525352 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.497 [2024-11-18 19:06:00.899534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.497 #23 NEW cov: 11805 ft: 13782 corp: 7/72b lim: 35 exec/s: 0 rss: 69Mb L: 14/14 MS: 1 ChangeBit- 00:07:42.497 [2024-11-18 19:06:00.959336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:70b2008b cdw11:7a750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.497 [2024-11-18 19:06:00.959365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.497 #24 NEW cov: 11805 ft: 13868 corp: 8/83b lim: 35 exec/s: 0 rss: 69Mb L: 11/14 MS: 1 InsertByte- 00:07:42.497 [2024-11-18 19:06:01.009019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b270008b cdw11:52750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.497 [2024-11-18 19:06:01.009046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.497 #25 NEW cov: 11805 ft: 14065 corp: 9/93b lim: 35 exec/s: 0 rss: 69Mb L: 10/14 MS: 1 ShuffleBytes- 00:07:42.497 [2024-11-18 19:06:01.059567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a0a cdw11:85520002 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:42.497 [2024-11-18 19:06:01.059595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.497 #28 NEW cov: 11805 ft: 14097 corp: 10/103b lim: 35 exec/s: 0 rss: 69Mb L: 10/14 MS: 3 InsertByte-EraseBytes-CrossOver- 00:07:42.756 [2024-11-18 19:06:01.110005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b270008b cdw11:7a750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.110034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.756 [2024-11-18 19:06:01.110154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:8bb24001 cdw11:76130001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.110173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.756 #29 NEW cov: 11805 ft: 14129 corp: 11/121b lim: 35 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 CMP- DE: "\001\213\262v\023\213Q:"- 00:07:42.756 [2024-11-18 19:06:01.159982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.160010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.756 #30 NEW cov: 11805 ft: 14201 corp: 12/130b lim: 35 exec/s: 0 rss: 69Mb L: 9/18 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\001"- 00:07:42.756 [2024-11-18 19:06:01.210331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a0a cdw11:85520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.210360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.756 [2024-11-18 19:06:01.210481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.210499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.756 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.756 #31 NEW cov: 11828 ft: 14264 corp: 13/148b lim: 35 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:07:42.756 [2024-11-18 19:06:01.270591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52850a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.270620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.756 [2024-11-18 19:06:01.270736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:52525372 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.270753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.756 #32 NEW cov: 11828 ft: 14284 corp: 14/162b lim: 35 exec/s: 0 rss: 
69Mb L: 14/18 MS: 1 ChangeBit- 00:07:42.756 [2024-11-18 19:06:01.331075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.331103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.756 [2024-11-18 19:06:01.331242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5252010a cdw11:85520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.331259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.756 [2024-11-18 19:06:01.331384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.756 [2024-11-18 19:06:01.331400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.756 #33 NEW cov: 11828 ft: 14513 corp: 15/184b lim: 35 exec/s: 33 rss: 69Mb L: 22/22 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:07:43.015 [2024-11-18 19:06:01.381215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.381245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.015 [2024-11-18 19:06:01.381372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c5c501c5 cdw11:c5c50003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.381390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.015 [2024-11-18 19:06:01.381511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c5c5c5c5 cdw11:c5c50003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.381531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.015 #34 NEW cov: 11828 ft: 14530 corp: 16/211b lim: 35 exec/s: 34 rss: 69Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:43.015 [2024-11-18 19:06:01.440856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.440885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.015 #35 NEW cov: 11828 ft: 14550 corp: 17/221b lim: 35 exec/s: 35 rss: 69Mb L: 10/27 MS: 1 ShuffleBytes- 00:07:43.015 [2024-11-18 19:06:01.491236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52850a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.491265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.015 [2024-11-18 19:06:01.491384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:eb709879 cdw11:b28b0000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.491401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.015 #36 NEW cov: 11828 ft: 14572 corp: 18/235b lim: 35 exec/s: 36 rss: 69Mb L: 14/27 MS: 1 CMP- DE: "F\230y\353p\262\213\000"- 00:07:43.015 [2024-11-18 19:06:01.541608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.541636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.015 [2024-11-18 19:06:01.541754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:52520001 cdw11:85520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.015 [2024-11-18 19:06:01.541772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.016 [2024-11-18 19:06:01.541882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-18 19:06:01.541898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.016 #37 NEW cov: 11828 ft: 14590 corp: 19/257b lim: 35 exec/s: 37 rss: 69Mb L: 22/27 MS: 1 ShuffleBytes- 00:07:43.016 [2024-11-18 19:06:01.601853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-18 19:06:01.601880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.016 [2024-11-18 19:06:01.602009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff5252 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-18 19:06:01.602027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.016 [2024-11-18 19:06:01.602144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.016 [2024-11-18 19:06:01.602160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.275 #38 NEW cov: 11828 ft: 14656 corp: 20/278b lim: 35 exec/s: 38 rss: 69Mb L: 21/27 MS: 1 InsertRepeatedBytes- 00:07:43.275 [2024-11-18 19:06:01.651343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52850a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-18 19:06:01.651374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 #39 NEW cov: 11828 ft: 14689 corp: 21/290b lim: 35 exec/s: 39 rss: 69Mb L: 12/27 MS: 1 EraseBytes- 00:07:43.275 [2024-11-18 19:06:01.701541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a52 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-18 19:06:01.701572] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 #40 NEW cov: 11828 ft: 14707 corp: 22/299b lim: 35 exec/s: 40 rss: 69Mb L: 9/27 MS: 1 EraseBytes- 00:07:43.275 [2024-11-18 19:06:01.762396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b270008b cdw11:7a750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-18 19:06:01.762426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.275 [2024-11-18 19:06:01.762558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5252400a cdw11:85520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-18 19:06:01.762577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.275 [2024-11-18 19:06:01.762696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:52525253 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.275 [2024-11-18 19:06:01.762715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.275 #41 NEW cov: 11828 ft: 14777 corp: 23/321b lim: 35 exec/s: 41 rss: 69Mb L: 22/27 MS: 1 PersAutoDict- DE: "\000\213\262pzu\012@"- 00:07:43.275 [2024-11-18 19:06:01.812575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:efef008b cdw11:efef0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.276 [2024-11-18 19:06:01.812602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.276 [2024-11-18 19:06:01.812732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efef0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.276 [2024-11-18 19:06:01.812748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.276 [2024-11-18 19:06:01.812874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7a7570b2 cdw11:0a400002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.276 [2024-11-18 19:06:01.812890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.276 #42 NEW cov: 11828 ft: 14785 corp: 24/343b lim: 35 exec/s: 42 rss: 69Mb L: 22/27 MS: 1 InsertRepeatedBytes- 00:07:43.276 [2024-11-18 19:06:01.862085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.276 [2024-11-18 19:06:01.862112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.535 #43 NEW cov: 11828 ft: 14836 corp: 25/353b lim: 35 exec/s: 43 rss: 69Mb L: 10/27 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:43.535 [2024-11-18 19:06:01.922317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a52 cdw11:01520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.535 [2024-11-18 19:06:01.922347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.535 #44 NEW cov: 11828 ft: 14851 corp: 26/362b lim: 35 exec/s: 44 rss: 70Mb L: 9/27 MS: 1 CrossOver- 00:07:43.535 [2024-11-18 19:06:01.982507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:70b2008b cdw11:7a750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.535 [2024-11-18 19:06:01.982537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.535 #45 NEW cov: 11828 ft: 14867 corp: 27/372b lim: 35 exec/s: 45 rss: 70Mb L: 10/27 MS: 1 ChangeByte- 00:07:43.535 [2024-11-18 19:06:02.032845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:efef008b cdw11:efef0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.535 [2024-11-18 19:06:02.032870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.535 [2024-11-18 19:06:02.032995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efef0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.535 [2024-11-18 19:06:02.033012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.535 #46 NEW cov: 11828 ft: 14902 corp: 28/388b lim: 35 exec/s: 46 rss: 70Mb L: 16/27 MS: 1 EraseBytes- 00:07:43.535 [2024-11-18 19:06:02.092779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:52520a0a cdw11:85520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.535 [2024-11-18 19:06:02.092806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.535 #47 NEW cov: 11828 ft: 14921 corp: 29/397b lim: 35 exec/s: 47 rss: 70Mb L: 9/27 MS: 1 CrossOver- 00:07:43.795 [2024-11-18 19:06:02.143562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ef6f008b cdw11:efef0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.795 [2024-11-18 19:06:02.143589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.795 [2024-11-18 19:06:02.143712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efef0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.143728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.796 [2024-11-18 19:06:02.143838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7a7570b2 cdw11:0a400002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.143854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.796 #48 NEW cov: 11828 ft: 14934 corp: 30/419b lim: 35 exec/s: 48 rss: 70Mb L: 22/27 MS: 1 ChangeBit- 00:07:43.796 [2024-11-18 19:06:02.193024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b270008b cdw11:7a750001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.193050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:43.796 #49 NEW cov: 11828 ft: 15007 corp: 31/429b lim: 35 exec/s: 49 rss: 70Mb L: 10/27 MS: 1 CopyPart- 00:07:43.796 [2024-11-18 19:06:02.243166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:41ffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.243193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.796 #50 NEW cov: 11828 ft: 15031 corp: 32/439b lim: 35 exec/s: 50 rss: 70Mb L: 10/27 MS: 1 ChangeByte- 00:07:43.796 [2024-11-18 19:06:02.294049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:9d9d0a52 cdw11:9d9d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.294076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.796 [2024-11-18 19:06:02.294211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:9d9d9d9d cdw11:9d9d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.294234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.796 [2024-11-18 19:06:02.294354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:9d9d9d9d cdw11:9d9d0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.796 [2024-11-18 19:06:02.294371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.796 #51 NEW cov: 11828 ft: 15039 corp: 33/466b lim: 35 exec/s: 25 rss: 70Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:43.796 #51 DONE cov: 11828 ft: 15039 corp: 33/466b lim: 35 exec/s: 25 rss: 70Mb 00:07:43.796 ###### Recommended dictionary. ###### 00:07:43.796 "\000\213\262pzu\012@" # Uses: 1 00:07:43.796 "\001\213\262v\023\213Q:" # Uses: 0 00:07:43.796 "\000\000\000\000\000\000\000\001" # Uses: 2 00:07:43.796 "F\230y\353p\262\213\000" # Uses: 0 00:07:43.796 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:43.796 ###### End of recommended dictionary. 
###### 00:07:43.796 Done 51 runs in 2 second(s) 00:07:44.056 19:06:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:44.056 19:06:02 -- ../common.sh@72 -- # (( i++ )) 00:07:44.056 19:06:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.056 19:06:02 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:44.056 19:06:02 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:44.056 19:06:02 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.056 19:06:02 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.056 19:06:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:44.056 19:06:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:44.056 19:06:02 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:44.056 19:06:02 -- nvmf/run.sh@29 -- # port=4405 00:07:44.056 19:06:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:44.056 19:06:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:44.056 19:06:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.056 19:06:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:44.056 [2024-11-18 19:06:02.489671] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:44.056 [2024-11-18 19:06:02.489734] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1300812 ] 00:07:44.056 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.315 [2024-11-18 19:06:02.747324] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.315 [2024-11-18 19:06:02.833976] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:44.315 [2024-11-18 19:06:02.834103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.315 [2024-11-18 19:06:02.892370] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:44.315 [2024-11-18 19:06:02.908713] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:44.574 INFO: Running with entropic power schedule (0xFF, 100). 00:07:44.575 INFO: Seed: 3016941022 00:07:44.575 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:44.575 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:44.575 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:44.575 INFO: A corpus is not provided, starting from an empty corpus 00:07:44.575 #2 INITED exec/s: 0 rss: 60Mb 00:07:44.575 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:44.575 This may also happen if the target rejected all inputs we tried so far 00:07:44.575 [2024-11-18 19:06:02.953383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.575 [2024-11-18 19:06:02.953418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.575 [2024-11-18 19:06:02.953449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.575 [2024-11-18 19:06:02.953464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.834 NEW_FUNC[1/671]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:44.834 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.834 #3 NEW cov: 11608 ft: 11609 corp: 2/24b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:44.834 [2024-11-18 19:06:03.274084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.834 [2024-11-18 19:06:03.274125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.834 #4 NEW cov: 11725 ft: 12729 corp: 3/33b lim: 45 exec/s: 0 rss: 68Mb L: 9/23 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:44.834 [2024-11-18 19:06:03.334199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.834 [2024-11-18 19:06:03.334233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.834 [2024-11-18 19:06:03.334264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41430002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.834 [2024-11-18 19:06:03.334279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.834 #5 NEW cov: 11731 ft: 13157 corp: 4/56b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeBinInt- 00:07:44.834 [2024-11-18 19:06:03.394291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.834 [2024-11-18 19:06:03.394323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.834 [2024-11-18 19:06:03.394353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.834 [2024-11-18 19:06:03.394368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.093 #6 NEW cov: 11816 ft: 13333 corp: 5/79b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeByte- 00:07:45.093 [2024-11-18 19:06:03.464438] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5b020200 cdw11:005b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.464470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.093 #9 NEW cov: 11816 ft: 13522 corp: 6/89b lim: 45 exec/s: 0 rss: 68Mb L: 10/23 MS: 3 EraseBytes-ChangeByte-CopyPart- 00:07:45.093 [2024-11-18 19:06:03.534690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.534720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.093 [2024-11-18 19:06:03.534751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:be430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.534770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.093 #10 NEW cov: 11816 ft: 13640 corp: 7/112b lim: 45 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeBinInt- 00:07:45.093 [2024-11-18 19:06:03.594890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41430a41 cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.594921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.093 [2024-11-18 19:06:03.594951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.594966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.093 [2024-11-18 19:06:03.594993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:41434141 cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.595008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.093 #11 NEW cov: 11816 ft: 13978 corp: 8/142b lim: 45 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CopyPart- 00:07:45.093 [2024-11-18 19:06:03.645023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.645053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.093 [2024-11-18 19:06:03.645083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:be430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.645099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.093 [2024-11-18 19:06:03.645126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:41024141 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.093 [2024-11-18 19:06:03.645141] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.093 #12 NEW cov: 11816 ft: 13987 corp: 9/173b lim: 45 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:45.353 [2024-11-18 19:06:03.705050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.705080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.353 #13 NEW cov: 11816 ft: 14023 corp: 10/186b lim: 45 exec/s: 0 rss: 68Mb L: 13/31 MS: 1 CrossOver- 00:07:45.353 [2024-11-18 19:06:03.755247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.755277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.353 [2024-11-18 19:06:03.755309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.755324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.353 #14 NEW cov: 11816 ft: 14045 corp: 11/209b lim: 45 exec/s: 0 rss: 68Mb L: 23/31 MS: 1 ChangeByte- 00:07:45.353 [2024-11-18 19:06:03.805300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.805330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.353 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:45.353 #15 NEW cov: 11833 ft: 14159 corp: 12/225b lim: 45 exec/s: 0 rss: 68Mb L: 16/31 MS: 1 EraseBytes- 00:07:45.353 [2024-11-18 19:06:03.865582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.865613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.353 [2024-11-18 19:06:03.865644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:43414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.865660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.353 #16 NEW cov: 11833 ft: 14201 corp: 13/248b lim: 45 exec/s: 0 rss: 69Mb L: 23/31 MS: 1 ChangeBit- 00:07:45.353 [2024-11-18 19:06:03.925757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.925789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.353 [2024-11-18 19:06:03.925821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 
cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.353 [2024-11-18 19:06:03.925836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.612 #17 NEW cov: 11833 ft: 14248 corp: 14/272b lim: 45 exec/s: 17 rss: 69Mb L: 24/31 MS: 1 InsertByte- 00:07:45.612 [2024-11-18 19:06:03.975951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.612 [2024-11-18 19:06:03.975982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.612 [2024-11-18 19:06:03.976013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:43414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.612 [2024-11-18 19:06:03.976028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.612 [2024-11-18 19:06:03.976054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.612 [2024-11-18 19:06:03.976069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:03.976096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:03.976110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.613 #18 NEW cov: 11833 ft: 14594 corp: 15/311b lim: 45 exec/s: 18 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:45.613 [2024-11-18 19:06:04.036028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.036058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.036088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:43414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.036103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.613 #19 NEW cov: 11833 ft: 14672 corp: 16/334b lim: 45 exec/s: 19 rss: 69Mb L: 23/39 MS: 1 ShuffleBytes- 00:07:45.613 [2024-11-18 19:06:04.086172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41450002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.086204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.086237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.086254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.613 #20 NEW cov: 11833 
ft: 14691 corp: 17/357b lim: 45 exec/s: 20 rss: 69Mb L: 23/39 MS: 1 ChangeBit- 00:07:45.613 [2024-11-18 19:06:04.136377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:be430abe cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.136408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.136439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.136455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.136482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:be430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.136497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.136523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:41024141 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.136539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.613 #21 NEW cov: 11833 ft: 14710 corp: 18/397b lim: 45 exec/s: 21 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:45.613 [2024-11-18 19:06:04.206605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.206636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.206667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffbebe cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.206682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.206708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.206723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.613 [2024-11-18 19:06:04.206750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:436fbebe cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.613 [2024-11-18 19:06:04.206765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.872 #22 NEW cov: 11833 ft: 14731 corp: 19/435b lim: 45 exec/s: 22 rss: 69Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:07:45.872 [2024-11-18 19:06:04.256535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.256574] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.872 #23 NEW cov: 11833 ft: 14817 corp: 20/448b lim: 45 exec/s: 23 rss: 69Mb L: 13/40 MS: 1 ShuffleBytes- 00:07:45.872 [2024-11-18 19:06:04.326783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.326813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.872 [2024-11-18 19:06:04.326844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:4141ac41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.326860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.872 #24 NEW cov: 11833 ft: 14841 corp: 21/473b lim: 45 exec/s: 24 rss: 69Mb L: 25/40 MS: 1 InsertByte- 00:07:45.872 [2024-11-18 19:06:04.386913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:be430abe cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.386944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.872 [2024-11-18 19:06:04.386974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41024141 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.386989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.872 #25 NEW cov: 11833 ft: 14851 corp: 22/495b lim: 45 exec/s: 25 rss: 69Mb L: 22/40 MS: 1 EraseBytes- 00:07:45.872 [2024-11-18 19:06:04.447285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:be430abe cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.447320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.872 [2024-11-18 19:06:04.447353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:413b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.447369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.872 [2024-11-18 19:06:04.447398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebe0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.447414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.872 [2024-11-18 19:06:04.447443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:41414141 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.872 [2024-11-18 19:06:04.447459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.132 #26 NEW cov: 11833 ft: 14876 corp: 23/536b lim: 45 exec/s: 26 rss: 69Mb L: 41/41 MS: 1 InsertByte- 00:07:46.132 [2024-11-18 
19:06:04.507451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:be430abe cdw11:6f410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.507486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.507520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.507538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.507577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:be430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.507599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.507630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:41024141 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.507647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.132 #27 NEW cov: 11833 ft: 14914 corp: 24/577b lim: 45 exec/s: 27 rss: 69Mb L: 41/41 MS: 1 InsertByte- 00:07:46.132 [2024-11-18 19:06:04.557442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.557475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.557510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41430003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.557528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.132 #28 NEW cov: 11833 ft: 14925 corp: 25/600b lim: 45 exec/s: 28 rss: 69Mb L: 23/41 MS: 1 ShuffleBytes- 00:07:46.132 [2024-11-18 19:06:04.607616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.607648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.607682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.607698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.132 [2024-11-18 19:06:04.607727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:43413232 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.607743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.132 #29 NEW 
cov: 11833 ft: 14940 corp: 26/629b lim: 45 exec/s: 29 rss: 69Mb L: 29/41 MS: 1 InsertRepeatedBytes- 00:07:46.132 [2024-11-18 19:06:04.667592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5b020200 cdw11:005b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.132 [2024-11-18 19:06:04.667622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.132 #30 NEW cov: 11833 ft: 15009 corp: 27/640b lim: 45 exec/s: 30 rss: 69Mb L: 11/41 MS: 1 InsertByte- 00:07:46.392 [2024-11-18 19:06:04.737882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.737914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.392 [2024-11-18 19:06:04.737947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:43414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.737964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.392 #31 NEW cov: 11833 ft: 15024 corp: 28/663b lim: 45 exec/s: 31 rss: 70Mb L: 23/41 MS: 1 CrossOver- 00:07:46.392 [2024-11-18 19:06:04.798803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:2c410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.798841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.392 [2024-11-18 19:06:04.798921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.798945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.392 #32 NEW cov: 11833 ft: 15168 corp: 29/686b lim: 45 exec/s: 32 rss: 70Mb L: 23/41 MS: 1 ShuffleBytes- 00:07:46.392 [2024-11-18 19:06:04.838690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.838717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.392 #33 NEW cov: 11840 ft: 15211 corp: 30/702b lim: 45 exec/s: 33 rss: 70Mb L: 16/41 MS: 1 ChangeBit- 00:07:46.392 [2024-11-18 19:06:04.878965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:41410a41 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.878991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.392 [2024-11-18 19:06:04.879058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:41414141 cdw11:41410002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.879076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.392 #34 NEW cov: 11840 ft: 15247 
corp: 31/726b lim: 45 exec/s: 34 rss: 70Mb L: 24/41 MS: 1 ChangeByte- 00:07:46.392 [2024-11-18 19:06:04.918925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.392 [2024-11-18 19:06:04.918952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.392 #35 NEW cov: 11840 ft: 15330 corp: 32/739b lim: 45 exec/s: 17 rss: 70Mb L: 13/41 MS: 1 ChangeByte- 00:07:46.392 #35 DONE cov: 11840 ft: 15330 corp: 32/739b lim: 45 exec/s: 17 rss: 70Mb 00:07:46.392 ###### Recommended dictionary. ###### 00:07:46.392 "\002\000\000\000\000\000\000\000" # Uses: 1 00:07:46.392 ###### End of recommended dictionary. ###### 00:07:46.392 Done 35 runs in 2 second(s) 00:07:46.652 19:06:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:46.652 19:06:05 -- ../common.sh@72 -- # (( i++ )) 00:07:46.652 19:06:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.652 19:06:05 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:46.652 19:06:05 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:46.652 19:06:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:46.652 19:06:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.652 19:06:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:46.652 19:06:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:46.652 19:06:05 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:46.652 19:06:05 -- nvmf/run.sh@29 -- # port=4406 00:07:46.652 19:06:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:46.652 19:06:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:46.652 19:06:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.652 19:06:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:46.652 [2024-11-18 19:06:05.133420] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:46.652 [2024-11-18 19:06:05.133496] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1301687 ] 00:07:46.652 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.911 [2024-11-18 19:06:05.385741] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.911 [2024-11-18 19:06:05.468996] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.911 [2024-11-18 19:06:05.469123] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.171 [2024-11-18 19:06:05.527305] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.171 [2024-11-18 19:06:05.543644] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:47.171 INFO: Running with entropic power schedule (0xFF, 100). 00:07:47.171 INFO: Seed: 1354955494 00:07:47.171 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:47.171 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:47.171 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:47.171 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.171 #2 INITED exec/s: 0 rss: 60Mb 00:07:47.171 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.171 This may also happen if the target rejected all inputs we tried so far 00:07:47.171 [2024-11-18 19:06:05.588406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:47.171 [2024-11-18 19:06:05.588442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 NEW_FUNC[1/668]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:47.431 NEW_FUNC[2/668]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.431 #3 NEW cov: 11522 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:47.431 [2024-11-18 19:06:05.919056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7e cdw11:00000000 00:07:47.431 [2024-11-18 19:06:05.919100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 NEW_FUNC[1/1]: 0x19497b8 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:07:47.431 #4 NEW cov: 11642 ft: 11933 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:07:47.431 [2024-11-18 19:06:05.989083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a7e cdw11:00000000 00:07:47.431 [2024-11-18 19:06:05.989115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #5 NEW cov: 11648 ft: 12218 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:47.690 [2024-11-18 19:06:06.049230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a7c cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.049261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #6 NEW cov: 11733 ft: 12520 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:47.690 [2024-11-18 19:06:06.109374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a6a cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.109404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #10 NEW cov: 11733 ft: 12766 corp: 6/11b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 4 EraseBytes-ChangeBit-ShuffleBytes-CrossOver- 00:07:47.690 [2024-11-18 19:06:06.159478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.159508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #12 NEW cov: 11733 ft: 12884 corp: 7/14b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 2 ShuffleBytes-CrossOver- 00:07:47.690 [2024-11-18 19:06:06.209794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.209823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 [2024-11-18 19:06:06.209853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b273 cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.209868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.690 [2024-11-18 19:06:06.209894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ab13 cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.209908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.690 [2024-11-18 19:06:06.209934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00009ab0 cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.209949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.690 [2024-11-18 19:06:06.209975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.209990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.690 #13 NEW cov: 11733 ft: 13283 corp: 8/24b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CMP- DE: "\377\212\262s\253\023\232\260"- 00:07:47.690 [2024-11-18 19:06:06.269771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2b cdw11:00000000 00:07:47.690 [2024-11-18 19:06:06.269801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #14 NEW cov: 11733 ft: 13417 corp: 9/26b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:07:47.950 [2024-11-18 19:06:06.329959] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2b cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.329987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #15 NEW cov: 11733 ft: 13448 corp: 10/28b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:47.950 [2024-11-18 19:06:06.390118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.390148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #16 NEW cov: 11733 ft: 13463 corp: 11/30b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 CrossOver- 00:07:47.950 [2024-11-18 19:06:06.440215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a7e cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.440244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.950 #17 NEW cov: 11756 ft: 13536 corp: 12/33b lim: 10 exec/s: 0 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:07:47.950 [2024-11-18 19:06:06.490390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.490420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 [2024-11-18 19:06:06.490449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.490465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.950 #19 NEW cov: 11756 ft: 13718 corp: 13/38b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 2 CopyPart-CMP- DE: "\377\377\377\017"- 00:07:47.950 [2024-11-18 19:06:06.540475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:47.950 [2024-11-18 19:06:06.540504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 #22 NEW cov: 11756 ft: 13768 corp: 14/40b lim: 10 exec/s: 22 rss: 69Mb L: 2/10 MS: 3 EraseBytes-ShuffleBytes-InsertByte- 00:07:48.209 [2024-11-18 19:06:06.600701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a01 cdw11:00000000 00:07:48.209 [2024-11-18 19:06:06.600732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 #25 NEW cov: 11756 ft: 13799 corp: 15/43b lim: 10 exec/s: 25 rss: 69Mb L: 3/10 MS: 3 EraseBytes-ChangeBit-CMP- DE: "\001\000"- 00:07:48.209 [2024-11-18 19:06:06.660866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001010 cdw11:00000000 00:07:48.209 [2024-11-18 19:06:06.660895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.210 [2024-11-18 19:06:06.660924] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001010 cdw11:00000000 00:07:48.210 [2024-11-18 19:06:06.660939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.210 #28 NEW cov: 11756 ft: 13835 corp: 16/48b lim: 10 exec/s: 28 rss: 69Mb L: 5/10 MS: 3 EraseBytes-CrossOver-InsertRepeatedBytes- 00:07:48.210 [2024-11-18 19:06:06.710926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7e cdw11:00000000 00:07:48.210 [2024-11-18 19:06:06.710954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.210 #29 NEW cov: 11756 ft: 13890 corp: 17/50b lim: 10 exec/s: 29 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:48.210 [2024-11-18 19:06:06.761058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:48.210 [2024-11-18 19:06:06.761087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.210 #30 NEW cov: 11756 ft: 13911 corp: 18/52b lim: 10 exec/s: 30 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:48.210 [2024-11-18 19:06:06.811239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:48.210 [2024-11-18 19:06:06.811269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.469 #31 NEW cov: 11756 ft: 13994 corp: 19/55b lim: 10 exec/s: 31 rss: 69Mb L: 3/10 MS: 1 ChangeBinInt- 00:07:48.469 [2024-11-18 19:06:06.871570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a40 cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.871599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.469 [2024-11-18 19:06:06.871629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.871644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.469 [2024-11-18 19:06:06.871670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.871684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.469 [2024-11-18 19:06:06.871710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.871728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.469 [2024-11-18 19:06:06.871754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000002b cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.871768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.469 #32 NEW cov: 11756 ft: 14003 corp: 20/65b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 CMP- DE: 
"@\000\000\000\000\000\000\000"- 00:07:48.469 [2024-11-18 19:06:06.921519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.921555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.469 [2024-11-18 19:06:06.921585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:48.469 [2024-11-18 19:06:06.921601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.469 #33 NEW cov: 11756 ft: 14011 corp: 21/69b lim: 10 exec/s: 33 rss: 69Mb L: 4/10 MS: 1 CopyPart- 00:07:48.470 [2024-11-18 19:06:06.981725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:48.470 [2024-11-18 19:06:06.981755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.470 [2024-11-18 19:06:06.981784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a7e cdw11:00000000 00:07:48.470 [2024-11-18 19:06:06.981799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.470 #34 NEW cov: 11756 ft: 14048 corp: 22/73b lim: 10 exec/s: 34 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:07:48.470 [2024-11-18 19:06:07.042008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ff8a cdw11:00000000 00:07:48.470 [2024-11-18 19:06:07.042037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.470 [2024-11-18 19:06:07.042067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b273 cdw11:00000000 00:07:48.470 [2024-11-18 19:06:07.042082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.470 [2024-11-18 19:06:07.042107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ab1b cdw11:00000000 00:07:48.470 [2024-11-18 19:06:07.042122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.470 [2024-11-18 19:06:07.042147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00009ab0 cdw11:00000000 00:07:48.470 [2024-11-18 19:06:07.042161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.470 [2024-11-18 19:06:07.042187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:48.470 [2024-11-18 19:06:07.042201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.729 #35 NEW cov: 11756 ft: 14069 corp: 23/83b lim: 10 exec/s: 35 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:48.729 [2024-11-18 19:06:07.102048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000014ff cdw11:00000000 00:07:48.729 [2024-11-18 
19:06:07.102077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.729 [2024-11-18 19:06:07.102106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.102125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.729 #36 NEW cov: 11756 ft: 14077 corp: 24/88b lim: 10 exec/s: 36 rss: 69Mb L: 5/10 MS: 1 ChangeByte- 00:07:48.729 [2024-11-18 19:06:07.162162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.162192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.729 #42 NEW cov: 11756 ft: 14115 corp: 25/91b lim: 10 exec/s: 42 rss: 69Mb L: 3/10 MS: 1 ChangeByte- 00:07:48.729 [2024-11-18 19:06:07.222366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000240a cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.222397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.729 #44 NEW cov: 11756 ft: 14144 corp: 26/93b lim: 10 exec/s: 44 rss: 70Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:48.729 [2024-11-18 19:06:07.272503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffab cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.272532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.729 [2024-11-18 19:06:07.272568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00001b9a cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.272584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.729 [2024-11-18 19:06:07.272609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000b00a cdw11:00000000 00:07:48.729 [2024-11-18 19:06:07.272624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.729 #45 NEW cov: 11756 ft: 14284 corp: 27/100b lim: 10 exec/s: 45 rss: 70Mb L: 7/10 MS: 1 EraseBytes- 00:07:48.989 [2024-11-18 19:06:07.332661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2b cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.332692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.989 #46 NEW cov: 11756 ft: 14292 corp: 28/102b lim: 10 exec/s: 46 rss: 70Mb L: 2/10 MS: 1 EraseBytes- 00:07:48.989 [2024-11-18 19:06:07.392819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a7c cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.392849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.989 [2024-11-18 19:06:07.392879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002a7c cdw11:00000000 00:07:48.989 
[2024-11-18 19:06:07.392893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.989 #47 NEW cov: 11756 ft: 14301 corp: 29/106b lim: 10 exec/s: 47 rss: 70Mb L: 4/10 MS: 1 CopyPart- 00:07:48.989 [2024-11-18 19:06:07.452979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a25 cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.453009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.989 [2024-11-18 19:06:07.453038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.453054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.989 #48 NEW cov: 11756 ft: 14344 corp: 30/111b lim: 10 exec/s: 48 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:48.989 [2024-11-18 19:06:07.503090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a25 cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.503125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.989 [2024-11-18 19:06:07.503156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.503171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.989 #49 NEW cov: 11756 ft: 14361 corp: 31/116b lim: 10 exec/s: 49 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:48.989 [2024-11-18 19:06:07.563301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.563332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.989 [2024-11-18 19:06:07.563364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a7c cdw11:00000000 00:07:48.989 [2024-11-18 19:06:07.563381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.249 #50 NEW cov: 11756 ft: 14371 corp: 32/120b lim: 10 exec/s: 25 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:49.249 #50 DONE cov: 11756 ft: 14371 corp: 32/120b lim: 10 exec/s: 25 rss: 70Mb 00:07:49.249 ###### Recommended dictionary. ###### 00:07:49.249 "\377\212\262s\253\023\232\260" # Uses: 0 00:07:49.249 "\377\377\377\017" # Uses: 0 00:07:49.249 "\001\000" # Uses: 0 00:07:49.249 "@\000\000\000\000\000\000\000" # Uses: 0 00:07:49.249 ###### End of recommended dictionary. 
###### 00:07:49.249 Done 50 runs in 2 second(s) 00:07:49.249 19:06:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:49.249 19:06:07 -- ../common.sh@72 -- # (( i++ )) 00:07:49.249 19:06:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.249 19:06:07 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:49.249 19:06:07 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:49.249 19:06:07 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.249 19:06:07 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.249 19:06:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:49.249 19:06:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:49.249 19:06:07 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:49.249 19:06:07 -- nvmf/run.sh@29 -- # port=4407 00:07:49.249 19:06:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:49.249 19:06:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:49.249 19:06:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.249 19:06:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:49.249 [2024-11-18 19:06:07.779710] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.249 [2024-11-18 19:06:07.779800] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1302328 ] 00:07:49.249 EAL: No free 2048 kB hugepages reported on node 1 00:07:49.508 [2024-11-18 19:06:08.024447] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.768 [2024-11-18 19:06:08.115111] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:49.768 [2024-11-18 19:06:08.115228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.768 [2024-11-18 19:06:08.173212] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.768 [2024-11-18 19:06:08.189528] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:49.768 INFO: Running with entropic power schedule (0xFF, 100). 00:07:49.768 INFO: Seed: 4000957038 00:07:49.768 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:49.768 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:49.768 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:49.768 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.768 #2 INITED exec/s: 0 rss: 60Mb 00:07:49.768 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
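A condensed sketch of the per-instance setup that the nvmf/run.sh trace above performs, with illustrative variable names; everything except the final redirection is read straight from the trace (bash xtrace does not print redirections, so writing sed's output into the per-run config file is an assumption):

  i=7                                           # fuzzer_type for this run
  port="44$(printf %02d "$i")"                  # yields 4407
  nvmf_cfg="/tmp/fuzz_json_${i}.conf"
  corpus_dir="$rootdir/../corpus/llvm_nvmf_${i}"
  mkdir -p "$corpus_dir"
  # stamp the per-run listener port into a copy of the shared target config (assumed redirection)
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # -m 0x1: core mask, -s 512: hugepage memory in MB, -F: target transport ID,
  # -t 1: time budget (timen from run.sh), -D: corpus dir, -Z 7: fuzzer_type, -r: per-instance RPC socket
  ./llvm_nvme_fuzz -m 0x1 -s 512 \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$i" -r "/var/tmp/spdk${i}.sock"
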
00:07:49.768 This may also happen if the target rejected all inputs we tried so far 00:07:49.768 [2024-11-18 19:06:08.234739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:49.768 [2024-11-18 19:06:08.234769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.026 NEW_FUNC[1/669]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:50.027 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.027 #3 NEW cov: 11529 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:50.027 [2024-11-18 19:06:08.535471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.535502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 #4 NEW cov: 11642 ft: 11985 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:50.027 [2024-11-18 19:06:08.575915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a4c cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.575942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-11-18 19:06:08.576006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.576024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-11-18 19:06:08.576088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.576109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-11-18 19:06:08.576175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.576191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 #5 NEW cov: 11648 ft: 12616 corp: 4/13b lim: 10 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:50.027 [2024-11-18 19:06:08.615638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:50.027 [2024-11-18 19:06:08.615666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #7 NEW cov: 11733 ft: 12954 corp: 5/15b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 2 ShuffleBytes-InsertByte- 00:07:50.286 [2024-11-18 19:06:08.655798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acb cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.655826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 
#10 NEW cov: 11733 ft: 12998 corp: 6/17b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 3 EraseBytes-CopyPart-InsertByte- 00:07:50.286 [2024-11-18 19:06:08.695886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000500a cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.695917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #11 NEW cov: 11733 ft: 13178 corp: 7/19b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 1 InsertByte- 00:07:50.286 [2024-11-18 19:06:08.725970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.725997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #12 NEW cov: 11733 ft: 13252 corp: 8/21b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:50.286 [2024-11-18 19:06:08.766098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008a0a cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.766125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #13 NEW cov: 11733 ft: 13259 corp: 9/23b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:50.286 [2024-11-18 19:06:08.796176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f0a cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.796204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #14 NEW cov: 11733 ft: 13270 corp: 10/26b lim: 10 exec/s: 0 rss: 69Mb L: 3/8 MS: 1 InsertByte- 00:07:50.286 [2024-11-18 19:06:08.826396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acb cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.826423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 [2024-11-18 19:06:08.826488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.826507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.286 #15 NEW cov: 11733 ft: 13458 corp: 11/30b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 CrossOver- 00:07:50.286 [2024-11-18 19:06:08.866380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000540a cdw11:00000000 00:07:50.286 [2024-11-18 19:06:08.866407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.286 #19 NEW cov: 11733 ft: 13473 corp: 12/32b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 4 EraseBytes-ShuffleBytes-ShuffleBytes-InsertByte- 00:07:50.546 [2024-11-18 19:06:08.907052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005353 cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.907079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:08.907144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005353 cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.907163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:08.907226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005353 cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.907243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:08.907308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005353 cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.907324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:08.907386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.907402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.546 #20 NEW cov: 11733 ft: 13544 corp: 13/42b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:50.546 [2024-11-18 19:06:08.946633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005454 cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.946660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 #22 NEW cov: 11733 ft: 13664 corp: 14/44b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 2 EraseBytes-CopyPart- 00:07:50.546 [2024-11-18 19:06:08.986885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acb cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.986912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:08.986978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.546 [2024-11-18 19:06:08.986997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.546 #23 NEW cov: 11733 ft: 13680 corp: 15/48b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:07:50.546 [2024-11-18 19:06:09.026853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.546 [2024-11-18 19:06:09.026881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 #24 NEW cov: 11733 ft: 13701 corp: 16/50b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:50.546 [2024-11-18 19:06:09.056952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a7e cdw11:00000000 00:07:50.546 [2024-11-18 19:06:09.056978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 #26 NEW cov: 11733 ft: 13724 corp: 17/52b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:50.546 [2024-11-18 19:06:09.087156] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:50.546 [2024-11-18 19:06:09.087184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 [2024-11-18 19:06:09.087250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.546 [2024-11-18 19:06:09.087269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.546 #27 NEW cov: 11733 ft: 13736 corp: 18/56b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:07:50.546 [2024-11-18 19:06:09.127210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9d cdw11:00000000 00:07:50.546 [2024-11-18 19:06:09.127237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:50.546 #28 NEW cov: 11756 ft: 13771 corp: 19/58b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 InsertByte- 00:07:50.806 [2024-11-18 19:06:09.157648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.157676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.157742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.157761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.157825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.157849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.157913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.157929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.806 #32 NEW cov: 11756 ft: 13865 corp: 20/67b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 4 EraseBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:07:50.806 [2024-11-18 19:06:09.197747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.197775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.197839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005dff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.197858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.197921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ 
(00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.197941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.198005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.198021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.806 #33 NEW cov: 11756 ft: 13950 corp: 21/76b lim: 10 exec/s: 33 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:07:50.806 [2024-11-18 19:06:09.237523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.237555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 #34 NEW cov: 11756 ft: 13961 corp: 22/79b lim: 10 exec/s: 34 rss: 69Mb L: 3/10 MS: 1 CopyPart- 00:07:50.806 [2024-11-18 19:06:09.278021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.278047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.278113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.278131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.278198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.278215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.278279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.278295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.806 #35 NEW cov: 11756 ft: 14015 corp: 23/87b lim: 10 exec/s: 35 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:07:50.806 [2024-11-18 19:06:09.318112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f00 cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.318139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.318205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.318228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.806 [2024-11-18 19:06:09.318293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.318315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:50.806 [2024-11-18 19:06:09.318379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.318396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.806 #36 NEW cov: 11756 ft: 14048 corp: 24/96b lim: 10 exec/s: 36 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:50.806 [2024-11-18 19:06:09.357875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000740a cdw11:00000000 00:07:50.806 [2024-11-18 19:06:09.357902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 #37 NEW cov: 11756 ft: 14061 corp: 25/98b lim: 10 exec/s: 37 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:50.806 [2024-11-18 19:06:09.397952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.807 [2024-11-18 19:06:09.397979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 #38 NEW cov: 11756 ft: 14118 corp: 26/101b lim: 10 exec/s: 38 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:51.066 [2024-11-18 19:06:09.438294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.438322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.438388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a4c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.438408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 #39 NEW cov: 11756 ft: 14128 corp: 27/105b lim: 10 exec/s: 39 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:51.066 [2024-11-18 19:06:09.478490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005400 cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.478517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.478590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.478610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.478676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.478695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.066 #40 NEW cov: 11756 ft: 14254 corp: 28/111b lim: 10 exec/s: 40 rss: 70Mb L: 6/10 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:51.066 [2024-11-18 19:06:09.518711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.518738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.518804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efff cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.518824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.518893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.518910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.518974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.518990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.066 #41 NEW cov: 11756 ft: 14311 corp: 29/120b lim: 10 exec/s: 41 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:07:51.066 [2024-11-18 19:06:09.558848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a4c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.558875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.558940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.558959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.559024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004ccc cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.559043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.559104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.559121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.066 #42 NEW cov: 11756 ft: 14336 corp: 30/128b lim: 10 exec/s: 42 rss: 70Mb L: 8/10 MS: 1 ChangeBit- 00:07:51.066 [2024-11-18 19:06:09.598696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f3ff cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.598722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.598790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffef cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.598811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 #46 NEW cov: 11756 ft: 14349 corp: 31/132b lim: 10 exec/s: 46 rss: 70Mb L: 4/10 MS: 4 CrossOver-CopyPart-ChangeByte-CrossOver- 00:07:51.066 [2024-11-18 19:06:09.629168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 
00:07:51.066 [2024-11-18 19:06:09.629195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.629261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000304c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.629279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.629344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.629361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.066 [2024-11-18 19:06:09.629425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:51.066 [2024-11-18 19:06:09.629441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.067 [2024-11-18 19:06:09.629508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:51.067 [2024-11-18 19:06:09.629524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.067 #47 NEW cov: 11756 ft: 14353 corp: 32/142b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:51.326 [2024-11-18 19:06:09.668928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000acb cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.668956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.326 [2024-11-18 19:06:09.669023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005d0a cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.669042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.326 #48 NEW cov: 11756 ft: 14364 corp: 33/147b lim: 10 exec/s: 48 rss: 70Mb L: 5/10 MS: 1 InsertByte- 00:07:51.326 [2024-11-18 19:06:09.709257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003f00 cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.709285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.326 [2024-11-18 19:06:09.709349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.709367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.326 [2024-11-18 19:06:09.709433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.709449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.326 [2024-11-18 19:06:09.709512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 
00:07:51.326 [2024-11-18 19:06:09.709528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.326 #49 NEW cov: 11756 ft: 14367 corp: 34/156b lim: 10 exec/s: 49 rss: 70Mb L: 9/10 MS: 1 ChangeByte- 00:07:51.326 [2024-11-18 19:06:09.749018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.749045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.326 #50 NEW cov: 11756 ft: 14398 corp: 35/159b lim: 10 exec/s: 50 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:51.326 [2024-11-18 19:06:09.789497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.326 [2024-11-18 19:06:09.789523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.326 [2024-11-18 19:06:09.789594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.789614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.327 [2024-11-18 19:06:09.789679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004ccc cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.789700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.327 [2024-11-18 19:06:09.789759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.789773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.327 #51 NEW cov: 11756 ft: 14410 corp: 36/167b lim: 10 exec/s: 51 rss: 70Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:51.327 [2024-11-18 19:06:09.829496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.829521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.327 [2024-11-18 19:06:09.829577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003535 cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.829590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.327 [2024-11-18 19:06:09.829644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003535 cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.829658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.327 #52 NEW cov: 11756 ft: 14421 corp: 37/173b lim: 10 exec/s: 52 rss: 70Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:07:51.327 [2024-11-18 19:06:09.869380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.869405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.327 #53 NEW cov: 11756 ft: 14494 corp: 38/176b lim: 10 exec/s: 53 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:51.327 [2024-11-18 19:06:09.909547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.327 [2024-11-18 19:06:09.909580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.327 #55 NEW cov: 11756 ft: 14497 corp: 39/178b lim: 10 exec/s: 55 rss: 70Mb L: 2/10 MS: 2 EraseBytes-CopyPart- 00:07:51.587 [2024-11-18 19:06:09.939847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002183 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:09.939872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:09.939926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008383 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:09.939940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:09.939997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008383 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:09.940011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:09.969675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000210a cdw11:00000000 00:07:51.587 [2024-11-18 19:06:09.969700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 #60 NEW cov: 11756 ft: 14503 corp: 40/181b lim: 10 exec/s: 60 rss: 70Mb L: 3/10 MS: 5 ChangeBinInt-ChangeBit-ShuffleBytes-InsertRepeatedBytes-CrossOver- 00:07:51.587 [2024-11-18 19:06:10.000065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005353 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.000090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.000143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005353 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.000172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.000225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005353 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.000243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.587 #61 NEW cov: 11756 ft: 14521 corp: 41/187b lim: 10 exec/s: 61 rss: 70Mb L: 6/10 MS: 1 EraseBytes- 00:07:51.587 [2024-11-18 19:06:10.039878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.039903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:51.587 #62 NEW cov: 11756 ft: 14529 corp: 42/190b lim: 10 exec/s: 62 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:51.587 [2024-11-18 19:06:10.080526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a30 cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.080556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.080613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000304c cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.080627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.080681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.080695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.080750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004c4c cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.080764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.080818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004c0a cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.080832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.587 #63 NEW cov: 11756 ft: 14539 corp: 43/200b lim: 10 exec/s: 63 rss: 70Mb L: 10/10 MS: 1 ChangeASCIIInt- 00:07:51.587 [2024-11-18 19:06:10.120280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a8a cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.120307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 [2024-11-18 19:06:10.120367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a2c cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.120381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.587 #64 NEW cov: 11756 ft: 14547 corp: 44/204b lim: 10 exec/s: 64 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:51.587 [2024-11-18 19:06:10.160285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000880a cdw11:00000000 00:07:51.587 [2024-11-18 19:06:10.160310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.587 #65 NEW cov: 11756 ft: 14554 corp: 45/206b lim: 10 exec/s: 65 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:51.847 [2024-11-18 19:06:10.200821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.848 [2024-11-18 19:06:10.200847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.848 [2024-11-18 19:06:10.200901] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000efff cdw11:00000000 00:07:51.848 [2024-11-18 19:06:10.200915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.848 [2024-11-18 19:06:10.200969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff01 cdw11:00000000 00:07:51.848 [2024-11-18 19:06:10.200983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.848 [2024-11-18 19:06:10.201034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.848 [2024-11-18 19:06:10.201048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.848 #66 NEW cov: 11756 ft: 14566 corp: 46/215b lim: 10 exec/s: 33 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:51.848 #66 DONE cov: 11756 ft: 14566 corp: 46/215b lim: 10 exec/s: 33 rss: 70Mb 00:07:51.848 ###### Recommended dictionary. ###### 00:07:51.848 "\000\000\000\000" # Uses: 0 00:07:51.848 ###### End of recommended dictionary. ###### 00:07:51.848 Done 66 runs in 2 second(s) 00:07:51.848 19:06:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:51.848 19:06:10 -- ../common.sh@72 -- # (( i++ )) 00:07:51.848 19:06:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.848 19:06:10 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:51.848 19:06:10 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:51.848 19:06:10 -- nvmf/run.sh@24 -- # local timen=1 00:07:51.848 19:06:10 -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.848 19:06:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:51.848 19:06:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:51.848 19:06:10 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:51.848 19:06:10 -- nvmf/run.sh@29 -- # port=4408 00:07:51.848 19:06:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:51.848 19:06:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:51.848 19:06:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.848 19:06:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:51.848 [2024-11-18 19:06:10.397478] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
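The paired ../common.sh@72 and @73 trace lines above are the driver loop stepping from fuzzer 7 to fuzzer 8; judging by the NOTICE lines, each index exercises a different admin command handler (run 7 above fuzzed DELETE IO SQ, run 8 below fuzzes NAMESPACE ATTACHMENT). A rough, illustrative reconstruction of that loop, not the verbatim script:

  # common.sh (sketch): cycle through fuzz_num llvm_nvme_fuzz targets,
  # giving each the same time budget and core mask
  for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" "$timen" 0x1   # sets up port 44xx, config, corpus, then runs llvm_nvme_fuzz -Z $i
  done
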
00:07:51.848 [2024-11-18 19:06:10.397574] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1302658 ] 00:07:51.848 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.107 [2024-11-18 19:06:10.648446] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.367 [2024-11-18 19:06:10.738907] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.367 [2024-11-18 19:06:10.739036] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.367 [2024-11-18 19:06:10.797565] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.367 [2024-11-18 19:06:10.813893] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:52.367 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.367 INFO: Seed: 2331028204 00:07:52.367 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:52.367 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:52.367 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:52.367 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.367 [2024-11-18 19:06:10.872421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.367 [2024-11-18 19:06:10.872496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.367 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:52.367 [2024-11-18 19:06:10.922012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.367 [2024-11-18 19:06:10.922039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.367 #3 NEW cov: 11670 ft: 12136 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:52.367 [2024-11-18 19:06:10.962262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.367 [2024-11-18 19:06:10.962287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.367 [2024-11-18 19:06:10.962343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.367 [2024-11-18 19:06:10.962357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.627 #4 NEW cov: 11676 ft: 12944 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CrossOver- 00:07:52.627 [2024-11-18 19:06:11.002222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.002248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.627 #5 NEW cov: 11761 ft: 13219 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:52.627 [2024-11-18 19:06:11.042741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.042766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.627 [2024-11-18 19:06:11.042836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.042852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.627 [2024-11-18 19:06:11.042904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.042917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.627 [2024-11-18 19:06:11.042970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.042983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.627 #6 NEW cov: 11761 ft: 13653 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:52.627 [2024-11-18 19:06:11.082412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.082437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.627 #7 NEW cov: 11761 ft: 13807 corp: 6/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 CopyPart- 00:07:52.627 [2024-11-18 19:06:11.122665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.122689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.627 [2024-11-18 19:06:11.122747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.627 [2024-11-18 19:06:11.122761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.627 #8 NEW cov: 11761 ft: 14020 corp: 7/12b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:07:52.628 [2024-11-18 19:06:11.162640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.162664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.628 #9 NEW cov: 11761 ft: 14051 corp: 8/13b lim: 5 exec/s: 0 
rss: 66Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:52.628 [2024-11-18 19:06:11.203351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.203375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.628 [2024-11-18 19:06:11.203428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.203444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.628 [2024-11-18 19:06:11.203496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.203509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.628 [2024-11-18 19:06:11.203564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.203577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.628 [2024-11-18 19:06:11.203631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.628 [2024-11-18 19:06:11.203644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.894 #10 NEW cov: 11761 ft: 14150 corp: 9/18b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:52.894 [2024-11-18 19:06:11.253061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.894 [2024-11-18 19:06:11.253086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.894 [2024-11-18 19:06:11.253138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.894 [2024-11-18 19:06:11.253154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.894 #11 NEW cov: 11761 ft: 14271 corp: 10/20b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeBit- 00:07:52.894 [2024-11-18 19:06:11.293018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.894 [2024-11-18 19:06:11.293042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.894 #12 NEW cov: 11761 ft: 14339 corp: 11/21b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:52.894 [2024-11-18 19:06:11.333155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.894 [2024-11-18 19:06:11.333182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.894 #13 NEW cov: 11761 ft: 14363 corp: 12/22b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:52.894 [2024-11-18 19:06:11.363233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.894 [2024-11-18 19:06:11.363257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.894 #14 NEW cov: 11761 ft: 14436 corp: 13/23b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:52.894 [2024-11-18 19:06:11.403517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.895 [2024-11-18 19:06:11.403541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.895 [2024-11-18 19:06:11.403599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.895 [2024-11-18 19:06:11.403613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.895 #15 NEW cov: 11761 ft: 14477 corp: 14/25b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 CopyPart- 00:07:52.895 [2024-11-18 19:06:11.443494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.895 [2024-11-18 19:06:11.443518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.895 #16 NEW cov: 11761 ft: 14511 corp: 15/26b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:52.895 [2024-11-18 19:06:11.483747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.895 [2024-11-18 19:06:11.483772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.895 [2024-11-18 19:06:11.483825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.895 [2024-11-18 19:06:11.483840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.157 #17 NEW cov: 11761 ft: 14527 corp: 16/28b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:07:53.157 [2024-11-18 19:06:11.524303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.157 [2024-11-18 19:06:11.524327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.524360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.524373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.524433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.524446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.524499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.524515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.524569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.524582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.158 #18 NEW cov: 11761 ft: 14582 corp: 17/33b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CopyPart- 00:07:53.158 [2024-11-18 19:06:11.563829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.563853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 #19 NEW cov: 11761 ft: 14595 corp: 18/34b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 CrossOver- 00:07:53.158 [2024-11-18 19:06:11.593872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.593896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 #20 NEW cov: 11761 ft: 14613 corp: 19/35b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:53.158 [2024-11-18 19:06:11.624121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.624145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.624201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.624214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.158 #21 NEW cov: 11761 ft: 14650 corp: 20/37b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:53.158 [2024-11-18 19:06:11.664264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 
19:06:11.664288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.664359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.664373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.158 #22 NEW cov: 11761 ft: 14658 corp: 21/39b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeByte- 00:07:53.158 [2024-11-18 19:06:11.704198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.704223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 #23 NEW cov: 11761 ft: 14701 corp: 22/40b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:53.158 [2024-11-18 19:06:11.744490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.744515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.158 [2024-11-18 19:06:11.744590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.158 [2024-11-18 19:06:11.744607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.676 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.676 #24 NEW cov: 11784 ft: 14743 corp: 23/42b lim: 5 exec/s: 24 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:53.676 [2024-11-18 19:06:12.065483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.065543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 #25 NEW cov: 11784 ft: 14835 corp: 24/43b lim: 5 exec/s: 25 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:53.676 [2024-11-18 19:06:12.115311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.115337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 #26 NEW cov: 11784 ft: 14848 corp: 25/44b lim: 5 exec/s: 26 rss: 68Mb L: 1/5 MS: 1 ChangeByte- 00:07:53.676 [2024-11-18 19:06:12.145367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.145393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 #27 NEW cov: 11784 ft: 14858 corp: 26/45b lim: 5 exec/s: 27 rss: 68Mb L: 1/5 MS: 1 
ChangeBinInt- 00:07:53.676 [2024-11-18 19:06:12.185788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.185813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.676 [2024-11-18 19:06:12.185869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.185883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.676 [2024-11-18 19:06:12.185936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.185949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.676 #28 NEW cov: 11784 ft: 15025 corp: 27/48b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:53.676 [2024-11-18 19:06:12.226239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.676 [2024-11-18 19:06:12.226264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.226319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.226333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.226388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.226402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.226456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.226474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.226527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.226541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.677 #29 NEW cov: 11784 ft: 15038 corp: 28/53b lim: 5 exec/s: 29 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.677 [2024-11-18 19:06:12.276405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.276430] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.276486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.276500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.276560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.276574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.276629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.276642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.677 [2024-11-18 19:06:12.276697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.677 [2024-11-18 19:06:12.276712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.948 #30 NEW cov: 11784 ft: 15043 corp: 29/58b lim: 5 exec/s: 30 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.948 [2024-11-18 19:06:12.315848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.315872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 #31 NEW cov: 11784 ft: 15064 corp: 30/59b lim: 5 exec/s: 31 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:53.948 [2024-11-18 19:06:12.356404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.356429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.356484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.356497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.356554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.356568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.356605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.356621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.948 #32 NEW cov: 11784 ft: 15084 corp: 31/63b lim: 5 exec/s: 32 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:53.948 [2024-11-18 19:06:12.396509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.396533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.396606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.396620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.396674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.396687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.396740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.396753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.948 #33 NEW cov: 11784 ft: 15091 corp: 32/67b lim: 5 exec/s: 33 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:53.948 [2024-11-18 19:06:12.436180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.436205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 #34 NEW cov: 11784 ft: 15108 corp: 33/68b lim: 5 exec/s: 34 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:53.948 [2024-11-18 19:06:12.476453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.476478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.476532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.476545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.948 #35 NEW cov: 11784 ft: 15113 corp: 34/70b lim: 5 exec/s: 35 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:53.948 [2024-11-18 19:06:12.516973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.516998] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.517051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.517067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.517120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.517136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.517191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.517204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.948 [2024-11-18 19:06:12.517261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.948 [2024-11-18 19:06:12.517274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.948 #36 NEW cov: 11784 ft: 15120 corp: 35/75b lim: 5 exec/s: 36 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:07:54.207 [2024-11-18 19:06:12.556506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.207 [2024-11-18 19:06:12.556530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.207 #37 NEW cov: 11784 ft: 15191 corp: 36/76b lim: 5 exec/s: 37 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:54.207 [2024-11-18 19:06:12.596784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.596808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.596862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.596876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 #38 NEW cov: 11784 ft: 15202 corp: 37/78b lim: 5 exec/s: 38 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:54.208 [2024-11-18 19:06:12.637190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.637214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.637268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.637283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.637336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.637349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.637403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.637416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.208 #39 NEW cov: 11784 ft: 15217 corp: 38/82b lim: 5 exec/s: 39 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:54.208 [2024-11-18 19:06:12.677024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.677048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.677103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.677119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 #40 NEW cov: 11784 ft: 15238 corp: 39/84b lim: 5 exec/s: 40 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:54.208 [2024-11-18 19:06:12.717393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.717418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.717472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.717487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.717541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.717558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.717614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.717627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.208 #41 NEW cov: 11784 ft: 
15286 corp: 40/88b lim: 5 exec/s: 41 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:54.208 [2024-11-18 19:06:12.757417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.757442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.757497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.757511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.757571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.757585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.208 #42 NEW cov: 11784 ft: 15292 corp: 41/91b lim: 5 exec/s: 42 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:54.208 [2024-11-18 19:06:12.797656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.797680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.797737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.797750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.797805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.797818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.208 [2024-11-18 19:06:12.797874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.208 [2024-11-18 19:06:12.797888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.468 #43 NEW cov: 11784 ft: 15297 corp: 42/95b lim: 5 exec/s: 43 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:54.468 [2024-11-18 19:06:12.837464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.468 [2024-11-18 19:06:12.837489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.468 [2024-11-18 19:06:12.837542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.468 [2024-11-18 
19:06:12.837558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.468 #44 NEW cov: 11784 ft: 15299 corp: 43/97b lim: 5 exec/s: 22 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:54.468 #44 DONE cov: 11784 ft: 15299 corp: 43/97b lim: 5 exec/s: 22 rss: 69Mb 00:07:54.468 Done 44 runs in 2 second(s) 00:07:54.468 19:06:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:54.468 19:06:12 -- ../common.sh@72 -- # (( i++ )) 00:07:54.468 19:06:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.468 19:06:12 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:54.468 19:06:12 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:54.468 19:06:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.468 19:06:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.468 19:06:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:54.468 19:06:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:54.468 19:06:12 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:54.468 19:06:12 -- nvmf/run.sh@29 -- # port=4409 00:07:54.468 19:06:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:54.468 19:06:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:54.468 19:06:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.468 19:06:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:54.468 [2024-11-18 19:06:13.022973] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:54.468 [2024-11-18 19:06:13.023046] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1303159 ] 00:07:54.468 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.728 [2024-11-18 19:06:13.273078] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.987 [2024-11-18 19:06:13.354851] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.987 [2024-11-18 19:06:13.354989] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.987 [2024-11-18 19:06:13.413180] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.987 [2024-11-18 19:06:13.429494] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:54.987 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:54.987 INFO: Seed: 652040008 00:07:54.987 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:54.987 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:54.987 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:54.987 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.987 [2024-11-18 19:06:13.506296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.987 [2024-11-18 19:06:13.506332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.987 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 66Mb 00:07:54.987 [2024-11-18 19:06:13.556370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.987 [2024-11-18 19:06:13.556399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.987 #3 NEW cov: 11670 ft: 12152 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:55.247 [2024-11-18 19:06:13.617004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.247 [2024-11-18 19:06:13.617031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.247 [2024-11-18 19:06:13.617123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.247 [2024-11-18 19:06:13.617140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.247 #4 NEW cov: 11676 ft: 12978 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:07:55.248 [2024-11-18 19:06:13.666956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.666984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.248 #5 NEW cov: 11761 ft: 13260 corp: 4/5b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:55.248 [2024-11-18 19:06:13.727141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.727168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.248 #6 NEW cov: 11761 ft: 13356 corp: 5/6b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 CrossOver- 00:07:55.248 [2024-11-18 19:06:13.778719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.778747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.248 
[2024-11-18 19:06:13.778830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.778847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.248 [2024-11-18 19:06:13.778920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.778936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.248 [2024-11-18 19:06:13.779008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.779024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.248 [2024-11-18 19:06:13.779106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.779122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.248 #7 NEW cov: 11761 ft: 13824 corp: 6/11b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:55.248 [2024-11-18 19:06:13.827593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.248 [2024-11-18 19:06:13.827619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 #8 NEW cov: 11761 ft: 13916 corp: 7/12b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:55.508 [2024-11-18 19:06:13.879319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.879344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.879421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.879436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.879510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.879525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.879667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.879684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.879753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.879769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.508 #9 NEW cov: 11761 ft: 13980 corp: 8/17b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CopyPart- 00:07:55.508 [2024-11-18 19:06:13.938097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.938123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 #10 NEW cov: 11761 ft: 14026 corp: 9/18b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:55.508 [2024-11-18 19:06:13.989723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.989750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.989828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.989844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.989916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.989936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.990013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.990029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:13.990102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:13.990117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.508 #11 NEW cov: 11761 ft: 14056 corp: 10/23b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:55.508 [2024-11-18 19:06:14.050101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.050127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:14.050205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.050220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:14.050291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.050306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:14.050379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.050392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:14.050463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.050478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.508 #12 NEW cov: 11761 ft: 14090 corp: 11/28b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:55.508 [2024-11-18 19:06:14.099119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.099146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.508 [2024-11-18 19:06:14.099221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.508 [2024-11-18 19:06:14.099237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.768 #13 NEW cov: 11761 ft: 14150 corp: 12/30b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:55.768 [2024-11-18 19:06:14.160413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.160439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.160514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.160534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.160636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.160653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.160731] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.160746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.160829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.160845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.768 #14 NEW cov: 11761 ft: 14168 corp: 13/35b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ShuffleBytes- 00:07:55.768 [2024-11-18 19:06:14.210667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.210693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.210771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.210785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.210857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.210872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.210945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.210960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.768 [2024-11-18 19:06:14.211030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.768 [2024-11-18 19:06:14.211046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.768 #15 NEW cov: 11761 ft: 14200 corp: 14/40b lim: 5 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:55.769 [2024-11-18 19:06:14.259891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.769 [2024-11-18 19:06:14.259917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.769 [2024-11-18 19:06:14.260001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.769 [2024-11-18 19:06:14.260016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:55.769 #16 NEW cov: 11761 ft: 14316 corp: 15/42b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:55.769 [2024-11-18 19:06:14.320095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.769 [2024-11-18 19:06:14.320124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.769 [2024-11-18 19:06:14.320201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.769 [2024-11-18 19:06:14.320217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.769 #17 NEW cov: 11761 ft: 14326 corp: 16/44b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:55.769 [2024-11-18 19:06:14.369927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.769 [2024-11-18 19:06:14.369954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.287 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.287 #18 NEW cov: 11784 ft: 14410 corp: 17/45b lim: 5 exec/s: 18 rss: 69Mb L: 1/5 MS: 1 ChangeByte- 00:07:56.287 [2024-11-18 19:06:14.680555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.680590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.680706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.680721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.680842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.680859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.287 #19 NEW cov: 11784 ft: 14738 corp: 18/48b lim: 5 exec/s: 19 rss: 69Mb L: 3/5 MS: 1 EraseBytes- 00:07:56.287 [2024-11-18 19:06:14.721092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.721120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.721233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.721250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.721369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.721387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.721499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.287 [2024-11-18 19:06:14.721516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.287 [2024-11-18 19:06:14.721643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.721665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.288 #20 NEW cov: 11784 ft: 14798 corp: 19/53b lim: 5 exec/s: 20 rss: 69Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:56.288 [2024-11-18 19:06:14.770528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.770559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.288 [2024-11-18 19:06:14.770669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.770688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.288 #21 NEW cov: 11784 ft: 14814 corp: 20/55b lim: 5 exec/s: 21 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:56.288 [2024-11-18 19:06:14.810968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.810996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.288 [2024-11-18 19:06:14.811113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.811133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.288 [2024-11-18 19:06:14.811251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.811270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.288 #22 NEW cov: 11784 ft: 14829 corp: 21/58b lim: 5 exec/s: 22 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:56.288 [2024-11-18 19:06:14.860763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.860790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.288 [2024-11-18 19:06:14.860908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.288 [2024-11-18 19:06:14.860924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.288 #23 NEW cov: 11784 ft: 14834 corp: 22/60b lim: 5 exec/s: 23 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:56.547 [2024-11-18 19:06:14.900946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.900972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.547 [2024-11-18 19:06:14.901101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.901119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.547 #24 NEW cov: 11784 ft: 14849 corp: 23/62b lim: 5 exec/s: 24 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:56.547 [2024-11-18 19:06:14.951372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.951400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.547 [2024-11-18 19:06:14.951532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.951555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.547 [2024-11-18 19:06:14.951667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.951683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.547 #25 NEW cov: 11784 ft: 14887 corp: 24/65b lim: 5 exec/s: 25 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:56.547 [2024-11-18 19:06:14.991191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.991220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.547 [2024-11-18 19:06:14.991342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.547 [2024-11-18 19:06:14.991362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:56.547 #26 NEW cov: 11784 ft: 14898 corp: 25/67b lim: 5 exec/s: 26 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:56.548 [2024-11-18 19:06:15.041030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.041059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.548 #27 NEW cov: 11784 ft: 14935 corp: 26/68b lim: 5 exec/s: 27 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:56.548 [2024-11-18 19:06:15.082271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.082297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.082406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.082423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.082543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.082561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.082680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.082698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.082815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.082832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.548 #28 NEW cov: 11784 ft: 14957 corp: 27/73b lim: 5 exec/s: 28 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:07:56.548 [2024-11-18 19:06:15.132419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.132447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.132566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.132584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.132702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.132720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.132839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.132856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.548 [2024-11-18 19:06:15.132971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.548 [2024-11-18 19:06:15.132990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.807 #29 NEW cov: 11784 ft: 15038 corp: 28/78b lim: 5 exec/s: 29 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:56.807 [2024-11-18 19:06:15.171739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.171767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.807 [2024-11-18 19:06:15.171885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.171902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.807 #30 NEW cov: 11784 ft: 15044 corp: 29/80b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:56.807 [2024-11-18 19:06:15.222276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.222303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.807 [2024-11-18 19:06:15.222422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.222440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.807 [2024-11-18 19:06:15.222564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.222583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.807 #31 NEW cov: 11784 ft: 15050 corp: 30/83b lim: 5 exec/s: 31 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:56.807 [2024-11-18 19:06:15.261762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.807 [2024-11-18 19:06:15.261789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.807 #32 
NEW cov: 11784 ft: 15062 corp: 31/84b lim: 5 exec/s: 32 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:56.808 [2024-11-18 19:06:15.312737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.312765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.312888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.312904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.313018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.313034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.313150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.313169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.808 #33 NEW cov: 11784 ft: 15119 corp: 32/88b lim: 5 exec/s: 33 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:56.808 [2024-11-18 19:06:15.363056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.363082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.363192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.363208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.363324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.363341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.363459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.363476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.363596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.363614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.808 #34 NEW cov: 11784 ft: 15155 corp: 33/93b lim: 5 exec/s: 34 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:56.808 [2024-11-18 19:06:15.403243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.403271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.403388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.403407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.403521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.403539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.403667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.403684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.808 [2024-11-18 19:06:15.403802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.808 [2024-11-18 19:06:15.403819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.067 #35 NEW cov: 11784 ft: 15167 corp: 34/98b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:57.067 [2024-11-18 19:06:15.452368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.067 [2024-11-18 19:06:15.452393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.067 #36 NEW cov: 11784 ft: 15190 corp: 35/99b lim: 5 exec/s: 18 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:57.067 #36 DONE cov: 11784 ft: 15190 corp: 35/99b lim: 5 exec/s: 18 rss: 70Mb 00:07:57.068 Done 36 runs in 2 second(s) 00:07:57.068 19:06:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:57.068 19:06:15 -- ../common.sh@72 -- # (( i++ )) 00:07:57.068 19:06:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.068 19:06:15 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:57.068 19:06:15 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:57.068 19:06:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.068 19:06:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.068 19:06:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.068 19:06:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:57.068 19:06:15 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:57.068 19:06:15 -- nvmf/run.sh@29 -- # port=4410 
00:07:57.068 19:06:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.068 19:06:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:57.068 19:06:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.068 19:06:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:57.068 [2024-11-18 19:06:15.641779] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:57.068 [2024-11-18 19:06:15.641855] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1303700 ] 00:07:57.328 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.328 [2024-11-18 19:06:15.819937] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.328 [2024-11-18 19:06:15.883048] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.328 [2024-11-18 19:06:15.883185] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.587 [2024-11-18 19:06:15.942132] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.587 [2024-11-18 19:06:15.958440] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:57.587 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.587 INFO: Seed: 3181044327 00:07:57.587 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:57.587 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:57.587 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.587 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.587 #2 INITED exec/s: 0 rss: 60Mb 00:07:57.587 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:57.587 This may also happen if the target rejected all inputs we tried so far 00:07:57.587 [2024-11-18 19:06:16.003707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e45b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.587 [2024-11-18 19:06:16.003735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.846 NEW_FUNC[1/670]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:57.846 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.846 #15 NEW cov: 11580 ft: 11581 corp: 2/11b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 3 ChangeByte-InsertByte-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:57.846 [2024-11-18 19:06:16.324484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.846 [2024-11-18 19:06:16.324515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.847 [2024-11-18 19:06:16.324577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.847 [2024-11-18 19:06:16.324592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.847 #16 NEW cov: 11693 ft: 12307 corp: 3/28b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:07:57.847 [2024-11-18 19:06:16.364414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.847 [2024-11-18 19:06:16.364441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.847 #22 NEW cov: 11699 ft: 12557 corp: 4/37b lim: 40 exec/s: 0 rss: 69Mb L: 9/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:57.847 [2024-11-18 19:06:16.394474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.847 [2024-11-18 19:06:16.394501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.847 #23 NEW cov: 11784 ft: 12850 corp: 5/47b lim: 40 exec/s: 0 rss: 69Mb L: 10/17 MS: 1 InsertByte- 00:07:57.847 [2024-11-18 19:06:16.434761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.847 [2024-11-18 19:06:16.434786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.847 [2024-11-18 19:06:16.434843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.847 [2024-11-18 19:06:16.434856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.106 #24 NEW cov: 11784 ft: 12886 corp: 6/64b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:58.106 [2024-11-18 19:06:16.474742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.474768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #25 NEW cov: 11784 ft: 13028 corp: 7/74b lim: 40 exec/s: 0 rss: 69Mb L: 10/17 MS: 1 CrossOver- 00:07:58.106 [2024-11-18 19:06:16.514837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a090000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.514863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #31 NEW cov: 11784 ft: 13108 corp: 8/83b lim: 40 exec/s: 0 rss: 69Mb L: 9/17 MS: 1 ChangeBinInt- 00:07:58.106 [2024-11-18 19:06:16.554916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000031 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.554942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #32 NEW cov: 11784 ft: 13170 corp: 9/93b lim: 40 exec/s: 0 rss: 69Mb L: 10/17 MS: 1 InsertByte- 00:07:58.106 [2024-11-18 19:06:16.585048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:e45b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.585073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #33 NEW cov: 11784 ft: 13207 corp: 10/103b lim: 40 exec/s: 0 rss: 69Mb L: 10/17 MS: 1 ShuffleBytes- 00:07:58.106 [2024-11-18 19:06:16.625114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.625139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #34 NEW cov: 11784 ft: 13238 corp: 11/111b lim: 40 exec/s: 0 rss: 69Mb L: 8/17 MS: 1 EraseBytes- 00:07:58.106 [2024-11-18 19:06:16.665249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.665274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 #35 NEW cov: 11784 ft: 13295 corp: 12/120b lim: 40 exec/s: 0 rss: 69Mb L: 9/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:58.106 [2024-11-18 19:06:16.695485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72bc279 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.695509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.106 [2024-11-18 
19:06:16.695583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b28b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.106 [2024-11-18 19:06:16.695598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.365 #36 NEW cov: 11784 ft: 13334 corp: 13/137b lim: 40 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 CMP- DE: "s\367+\302y\262\213\000"- 00:07:58.365 [2024-11-18 19:06:16.735573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72b29c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.735599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.365 [2024-11-18 19:06:16.735654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:79b28b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.735671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.365 #37 NEW cov: 11784 ft: 13352 corp: 14/155b lim: 40 exec/s: 0 rss: 69Mb L: 18/18 MS: 1 InsertByte- 00:07:58.365 [2024-11-18 19:06:16.775826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.775852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.365 [2024-11-18 19:06:16.775909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.775924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.365 [2024-11-18 19:06:16.775979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.775993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.365 #38 NEW cov: 11784 ft: 13612 corp: 15/180b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:58.365 [2024-11-18 19:06:16.815813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72bc279 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.815838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.365 [2024-11-18 19:06:16.815894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b28b0000 cdw11:0000002a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.815908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.365 #39 NEW cov: 11784 ft: 13625 corp: 16/197b lim: 40 exec/s: 0 rss: 69Mb L: 17/25 MS: 1 ChangeByte- 00:07:58.365 [2024-11-18 19:06:16.855921] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a002900 cdw11:73f72bc2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.855946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.365 [2024-11-18 19:06:16.856003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:79b28b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.856016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.365 #40 NEW cov: 11784 ft: 13683 corp: 17/215b lim: 40 exec/s: 0 rss: 69Mb L: 18/25 MS: 1 InsertByte- 00:07:58.365 [2024-11-18 19:06:16.885897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00c279 cdw11:b28b0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.365 [2024-11-18 19:06:16.885923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.366 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.366 #41 NEW cov: 11807 ft: 13757 corp: 18/228b lim: 40 exec/s: 0 rss: 70Mb L: 13/25 MS: 1 EraseBytes- 00:07:58.366 [2024-11-18 19:06:16.926029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72bc279 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.366 [2024-11-18 19:06:16.926055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.366 #42 NEW cov: 11807 ft: 13805 corp: 19/239b lim: 40 exec/s: 0 rss: 70Mb L: 11/25 MS: 1 EraseBytes- 00:07:58.366 [2024-11-18 19:06:16.966109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.366 [2024-11-18 19:06:16.966135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.625 #43 NEW cov: 11807 ft: 13834 corp: 20/247b lim: 40 exec/s: 0 rss: 70Mb L: 8/25 MS: 1 EraseBytes- 00:07:58.625 [2024-11-18 19:06:17.006710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2bc2bdbd cdw11:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.625 [2024-11-18 19:06:17.006736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.625 [2024-11-18 19:06:17.006792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:bdbdbdbd cdw11:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.625 [2024-11-18 19:06:17.006806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.625 [2024-11-18 19:06:17.006861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:bdbdbdbd cdw11:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.006875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 
19:06:17.006928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:bdbdbdbd cdw11:bdbdbdbd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.006941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.006996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:bdbdbdbd cdw11:79b20000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.007009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.626 #45 NEW cov: 11807 ft: 14394 corp: 21/287b lim: 40 exec/s: 45 rss: 70Mb L: 40/40 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:58.626 [2024-11-18 19:06:17.056365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.056390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.626 #46 NEW cov: 11807 ft: 14465 corp: 22/298b lim: 40 exec/s: 46 rss: 70Mb L: 11/40 MS: 1 InsertByte- 00:07:58.626 [2024-11-18 19:06:17.096480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:30000c00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.096505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.626 #47 NEW cov: 11807 ft: 14515 corp: 23/306b lim: 40 exec/s: 47 rss: 70Mb L: 8/40 MS: 1 EraseBytes- 00:07:58.626 [2024-11-18 19:06:17.136966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.136991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.137048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.137062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.137116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.137133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.137187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.137200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.626 #48 NEW cov: 11807 ft: 14550 corp: 24/342b lim: 40 exec/s: 48 rss: 70Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:07:58.626 [2024-11-18 19:06:17.176888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.176913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.176970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.176984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.177039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00300000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.177053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.177108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000c0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.177122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.626 #49 NEW cov: 11807 ft: 14594 corp: 25/378b lim: 40 exec/s: 49 rss: 70Mb L: 36/40 MS: 1 CrossOver- 00:07:58.626 [2024-11-18 19:06:17.216925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:73f72bc2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.216950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.626 [2024-11-18 19:06:17.217006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:79b28b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.626 [2024-11-18 19:06:17.217020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.885 #50 NEW cov: 11807 ft: 14598 corp: 26/395b lim: 40 exec/s: 50 rss: 70Mb L: 17/40 MS: 1 PersAutoDict- DE: "s\367+\302y\262\213\000"- 00:07:58.885 [2024-11-18 19:06:17.257012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72b29c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.257036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.885 [2024-11-18 19:06:17.257091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:79b28b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.257105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.885 #51 NEW cov: 11807 ft: 14613 corp: 27/413b lim: 40 exec/s: 51 rss: 70Mb L: 18/40 MS: 1 ChangeByte- 00:07:58.885 [2024-11-18 19:06:17.297133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:f72bc259 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.297157] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.885 [2024-11-18 19:06:17.297216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b28b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.297229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.885 #52 NEW cov: 11807 ft: 14617 corp: 28/430b lim: 40 exec/s: 52 rss: 70Mb L: 17/40 MS: 1 ChangeBit- 00:07:58.885 [2024-11-18 19:06:17.327362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00efefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.327387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.885 [2024-11-18 19:06:17.327444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.327457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.885 [2024-11-18 19:06:17.327511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefefef cdw11:efefef00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.327525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.885 #53 NEW cov: 11807 ft: 14625 corp: 29/456b lim: 40 exec/s: 53 rss: 70Mb L: 26/40 MS: 1 InsertRepeatedBytes- 00:07:58.885 [2024-11-18 19:06:17.367251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ae45b00 cdw11:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.367277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.885 #54 NEW cov: 11807 ft: 14645 corp: 30/467b lim: 40 exec/s: 54 rss: 70Mb L: 11/40 MS: 1 CrossOver- 00:07:58.885 [2024-11-18 19:06:17.407475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a002900 cdw11:73f72bc2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.407500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.885 [2024-11-18 19:06:17.407555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:79b28b02 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.885 [2024-11-18 19:06:17.407570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.886 #55 NEW cov: 11807 ft: 14656 corp: 31/485b lim: 40 exec/s: 55 rss: 70Mb L: 18/40 MS: 1 ChangeBit- 00:07:58.886 [2024-11-18 19:06:17.447708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.886 [2024-11-18 19:06:17.447733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:58.886 [2024-11-18 19:06:17.447788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.886 [2024-11-18 19:06:17.447803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.886 [2024-11-18 19:06:17.447860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.886 [2024-11-18 19:06:17.447874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.886 #56 NEW cov: 11807 ft: 14677 corp: 32/515b lim: 40 exec/s: 56 rss: 70Mb L: 30/40 MS: 1 EraseBytes- 00:07:59.145 [2024-11-18 19:06:17.487738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:312bc279 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.487768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.487824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b28b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.487838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 #57 NEW cov: 11807 ft: 14705 corp: 33/532b lim: 40 exec/s: 57 rss: 70Mb L: 17/40 MS: 1 ChangeByte- 00:07:59.145 [2024-11-18 19:06:17.527946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a008bb2 cdw11:7a3c967b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.527971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.528029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:04000073 cdw11:f72b29c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.528043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.528099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:79b28b00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.528113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.145 #58 NEW cov: 11807 ft: 14720 corp: 34/558b lim: 40 exec/s: 58 rss: 70Mb L: 26/40 MS: 1 CMP- DE: "\000\213\262z<\226{\004"- 00:07:59.145 [2024-11-18 19:06:17.568148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.568173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.568230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff 
cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.568243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.568299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.568312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.568366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.568380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.145 #59 NEW cov: 11807 ft: 14730 corp: 35/594b lim: 40 exec/s: 59 rss: 70Mb L: 36/40 MS: 1 ChangeBinInt- 00:07:59.145 [2024-11-18 19:06:17.608175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00310040 cdw11:00efefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.608200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.608257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.608270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.608327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefefef cdw11:efefef00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.608341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.145 #60 NEW cov: 11807 ft: 14752 corp: 36/620b lim: 40 exec/s: 60 rss: 70Mb L: 26/40 MS: 1 ChangeByte- 00:07:59.145 [2024-11-18 19:06:17.648272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00efefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.648297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.648355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.648369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.648422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefefef cdw11:efefef00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.648436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.145 #61 NEW cov: 11807 ft: 14765 corp: 37/646b lim: 40 exec/s: 61 rss: 70Mb L: 26/40 MS: 1 
ShuffleBytes- 00:07:59.145 [2024-11-18 19:06:17.688284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a002f73 cdw11:f72bc279 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.688309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 [2024-11-18 19:06:17.688365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:b28b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.688379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.145 #62 NEW cov: 11807 ft: 14791 corp: 38/663b lim: 40 exec/s: 62 rss: 70Mb L: 17/40 MS: 1 ChangeByte- 00:07:59.145 [2024-11-18 19:06:17.718282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000073 cdw11:0073f72b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.145 [2024-11-18 19:06:17.718307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.145 #64 NEW cov: 11807 ft: 14829 corp: 39/672b lim: 40 exec/s: 64 rss: 70Mb L: 9/40 MS: 2 EraseBytes-CopyPart- 00:07:59.405 [2024-11-18 19:06:17.758586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00efefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.758611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.758668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:efefefef cdw11:efefefef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.758681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.758735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:efefefef cdw11:efef4000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.758749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 #65 NEW cov: 11807 ft: 14843 corp: 40/698b lim: 40 exec/s: 65 rss: 70Mb L: 26/40 MS: 1 ChangeByte- 00:07:59.405 [2024-11-18 19:06:17.798836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.798868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.798925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.798939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.798991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00301000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.799005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.799059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000c0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.799072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #66 NEW cov: 11807 ft: 14855 corp: 41/734b lim: 40 exec/s: 66 rss: 70Mb L: 36/40 MS: 1 ChangeBit- 00:07:59.405 [2024-11-18 19:06:17.838593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:005b000a cdw11:e4000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.838618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 #67 NEW cov: 11807 ft: 14888 corp: 42/745b lim: 40 exec/s: 67 rss: 70Mb L: 11/40 MS: 1 ShuffleBytes- 00:07:59.405 [2024-11-18 19:06:17.879112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.879138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.879193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff01 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.879207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.879261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.879274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.879327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.879341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #68 NEW cov: 11807 ft: 14917 corp: 43/783b lim: 40 exec/s: 68 rss: 70Mb L: 38/40 MS: 1 CMP- DE: "\001\000"- 00:07:59.405 [2024-11-18 19:06:17.919166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.919192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.919245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.919259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:59.405 [2024-11-18 19:06:17.919318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:d8000000 cdw11:00003000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.919332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.919388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000c00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.919401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.405 #69 NEW cov: 11807 ft: 14927 corp: 44/820b lim: 40 exec/s: 69 rss: 70Mb L: 37/40 MS: 1 InsertByte- 00:07:59.405 [2024-11-18 19:06:17.959282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.959307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.405 [2024-11-18 19:06:17.959364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.405 [2024-11-18 19:06:17.959378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.406 [2024-11-18 19:06:17.959431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00300000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.406 [2024-11-18 19:06:17.959446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.406 [2024-11-18 19:06:17.959500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:002c0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.406 [2024-11-18 19:06:17.959513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.406 #70 NEW cov: 11807 ft: 14935 corp: 45/856b lim: 40 exec/s: 70 rss: 70Mb L: 36/40 MS: 1 ChangeBit- 00:07:59.406 [2024-11-18 19:06:17.999148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.406 [2024-11-18 19:06:17.999174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.406 [2024-11-18 19:06:17.999230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.406 [2024-11-18 19:06:17.999244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.665 #71 NEW cov: 11807 ft: 14941 corp: 46/875b lim: 40 exec/s: 35 rss: 70Mb L: 19/40 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:59.665 #71 DONE cov: 11807 ft: 14941 corp: 46/875b lim: 40 exec/s: 35 rss: 70Mb 00:07:59.665 ###### Recommended dictionary. 
###### 00:07:59.665 "\000\000\000\000\000\000\000\000" # Uses: 4 00:07:59.665 "s\367+\302y\262\213\000" # Uses: 1 00:07:59.665 "\000\213\262z<\226{\004" # Uses: 0 00:07:59.665 "\001\000" # Uses: 1 00:07:59.665 ###### End of recommended dictionary. ###### 00:07:59.665 Done 71 runs in 2 second(s) 00:07:59.665 19:06:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:59.665 19:06:18 -- ../common.sh@72 -- # (( i++ )) 00:07:59.665 19:06:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.665 19:06:18 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:59.665 19:06:18 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:59.665 19:06:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.665 19:06:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.665 19:06:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:59.665 19:06:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:59.665 19:06:18 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:59.665 19:06:18 -- nvmf/run.sh@29 -- # port=4411 00:07:59.665 19:06:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:59.665 19:06:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:59.665 19:06:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.665 19:06:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:59.665 [2024-11-18 19:06:18.184865] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:59.665 [2024-11-18 19:06:18.184930] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1304129 ] 00:07:59.665 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.925 [2024-11-18 19:06:18.365402] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.925 [2024-11-18 19:06:18.428284] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.925 [2024-11-18 19:06:18.428422] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.925 [2024-11-18 19:06:18.486400] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.925 [2024-11-18 19:06:18.502717] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:59.925 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:59.925 INFO: Seed: 1431058831 00:08:00.184 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:00.184 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:00.184 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:00.184 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.184 #2 INITED exec/s: 0 rss: 60Mb 00:08:00.184 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.184 This may also happen if the target rejected all inputs we tried so far 00:08:00.184 [2024-11-18 19:06:18.558177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.184 [2024-11-18 19:06:18.558205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.184 [2024-11-18 19:06:18.558265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.184 [2024-11-18 19:06:18.558279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.443 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:00.443 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.443 #7 NEW cov: 11585 ft: 11586 corp: 2/22b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 5 ChangeBit-CopyPart-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:00.443 [2024-11-18 19:06:18.889739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.889779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.443 [2024-11-18 19:06:18.889912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.889935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.443 #10 NEW cov: 11705 ft: 12366 corp: 3/41b lim: 40 exec/s: 0 rss: 68Mb L: 19/21 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:00.443 [2024-11-18 19:06:18.929497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.929529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.443 [2024-11-18 19:06:18.929652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.929671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.443 #11 NEW cov: 11711 ft: 12590 corp: 4/63b lim: 40 
exec/s: 0 rss: 68Mb L: 22/22 MS: 1 CrossOver- 00:08:00.443 [2024-11-18 19:06:18.980008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.980038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.443 [2024-11-18 19:06:18.980181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:18.980201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.443 #12 NEW cov: 11796 ft: 12964 corp: 5/85b lim: 40 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 ChangeBinInt- 00:08:00.443 [2024-11-18 19:06:19.030159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f20e0707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:19.030188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.443 [2024-11-18 19:06:19.030313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07073a07 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.443 [2024-11-18 19:06:19.030333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.703 #17 NEW cov: 11796 ft: 13121 corp: 6/106b lim: 40 exec/s: 0 rss: 68Mb L: 21/22 MS: 5 ShuffleBytes-InsertByte-ChangeByte-ChangeBinInt-CrossOver- 00:08:00.703 [2024-11-18 19:06:19.070239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.703 [2024-11-18 19:06:19.070270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.703 [2024-11-18 19:06:19.070396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.703 [2024-11-18 19:06:19.070413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.703 #18 NEW cov: 11796 ft: 13157 corp: 7/127b lim: 40 exec/s: 0 rss: 68Mb L: 21/22 MS: 1 ShuffleBytes- 00:08:00.703 [2024-11-18 19:06:19.110397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0824 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.703 [2024-11-18 19:06:19.110423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.703 [2024-11-18 19:06:19.110553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.703 [2024-11-18 19:06:19.110570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.703 #19 NEW cov: 11796 ft: 13201 corp: 8/148b lim: 40 exec/s: 0 rss: 68Mb L: 21/22 MS: 1 ChangeByte- 00:08:00.703 [2024-11-18 19:06:19.151060] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.703 [2024-11-18 19:06:19.151089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.151218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.151236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.151365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:222222ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.151382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.151505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.151522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.704 #20 NEW cov: 11796 ft: 13621 corp: 9/184b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:00.704 [2024-11-18 19:06:19.201204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.201232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.201366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.201382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.201506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.201522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.201652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.201669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.704 #21 NEW cov: 11796 ft: 13711 corp: 10/216b lim: 40 exec/s: 0 rss: 68Mb L: 32/36 MS: 1 CopyPart- 00:08:00.704 [2024-11-18 19:06:19.240806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.240833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 
19:06:19.240959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.240980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.704 #22 NEW cov: 11796 ft: 13753 corp: 11/238b lim: 40 exec/s: 0 rss: 68Mb L: 22/36 MS: 1 ChangeBit- 00:08:00.704 [2024-11-18 19:06:19.280938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0b464646 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.280967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.704 [2024-11-18 19:06:19.281104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.704 [2024-11-18 19:06:19.281122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.704 #24 NEW cov: 11796 ft: 13796 corp: 12/261b lim: 40 exec/s: 0 rss: 68Mb L: 23/36 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:00.963 [2024-11-18 19:06:19.320963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.320989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.321117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.321135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.963 #25 NEW cov: 11796 ft: 13818 corp: 13/283b lim: 40 exec/s: 0 rss: 68Mb L: 22/36 MS: 1 ChangeBinInt- 00:08:00.963 [2024-11-18 19:06:19.361148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.361174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.361296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.361314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.963 #26 NEW cov: 11796 ft: 13834 corp: 14/302b lim: 40 exec/s: 0 rss: 68Mb L: 19/36 MS: 1 ShuffleBytes- 00:08:00.963 [2024-11-18 19:06:19.401318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.401345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.401470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:00002922 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.401486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.963 #32 NEW cov: 11796 ft: 13892 corp: 15/324b lim: 40 exec/s: 0 rss: 68Mb L: 22/36 MS: 1 ChangeByte- 00:08:00.963 [2024-11-18 19:06:19.452032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.452060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.452197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.452217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.452334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:222222ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.452351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.452482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.452500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.963 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.963 #33 NEW cov: 11819 ft: 13922 corp: 16/360b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 CopyPart- 00:08:00.963 [2024-11-18 19:06:19.512256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.512285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.512408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:222222fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.512427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.512554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fefefefe cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.512575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.963 [2024-11-18 19:06:19.512705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fefefe22 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.963 [2024-11-18 19:06:19.512722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:00.963 #34 NEW cov: 11819 ft: 13998 corp: 17/394b lim: 40 exec/s: 0 rss: 69Mb L: 34/36 MS: 1 InsertRepeatedBytes- 00:08:00.964 [2024-11-18 19:06:19.551652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.964 [2024-11-18 19:06:19.551679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.964 [2024-11-18 19:06:19.551807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fe002922 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.964 [2024-11-18 19:06:19.551825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 #35 NEW cov: 11819 ft: 14008 corp: 18/416b lim: 40 exec/s: 35 rss: 69Mb L: 22/36 MS: 1 ChangeBinInt- 00:08:01.223 [2024-11-18 19:06:19.602320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.602349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.602488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.602506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.602639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:46464622 cdw11:22fefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.602657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.602786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fefefefe cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.602804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.602938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:fe222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.602956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:01.223 #36 NEW cov: 11819 ft: 14109 corp: 19/456b lim: 40 exec/s: 36 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:01.223 [2024-11-18 19:06:19.652092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.652121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.652250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 
19:06:19.652267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 #37 NEW cov: 11819 ft: 14119 corp: 20/475b lim: 40 exec/s: 37 rss: 69Mb L: 19/40 MS: 1 ShuffleBytes- 00:08:01.223 [2024-11-18 19:06:19.692696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.692724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.692858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:222222fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.692876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.693000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fefefefe cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.693019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.693116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fefefe22 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.693132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.223 #38 NEW cov: 11819 ft: 14149 corp: 21/509b lim: 40 exec/s: 38 rss: 69Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:01.223 [2024-11-18 19:06:19.732511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.732539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.732677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.732696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.732819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.732836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.223 #39 NEW cov: 11819 ft: 14357 corp: 22/540b lim: 40 exec/s: 39 rss: 69Mb L: 31/40 MS: 1 CopyPart- 00:08:01.223 [2024-11-18 19:06:19.782365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.782395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.782518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2222a222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.782537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.223 #40 NEW cov: 11819 ft: 14362 corp: 23/561b lim: 40 exec/s: 40 rss: 69Mb L: 21/40 MS: 1 ChangeBit- 00:08:01.223 [2024-11-18 19:06:19.822195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.822223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.223 [2024-11-18 19:06:19.822354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:faff2122 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.223 [2024-11-18 19:06:19.822373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 #41 NEW cov: 11819 ft: 14374 corp: 24/583b lim: 40 exec/s: 41 rss: 69Mb L: 22/40 MS: 1 ChangeBinInt- 00:08:01.483 [2024-11-18 19:06:19.862159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f20e0707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.862188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.862323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07073a07 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.862341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 #42 NEW cov: 11819 ft: 14392 corp: 25/604b lim: 40 exec/s: 42 rss: 69Mb L: 21/40 MS: 1 CopyPart- 00:08:01.483 [2024-11-18 19:06:19.913326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.913355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.913493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:222222fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.913510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.913633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:65656565 cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.913652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.913780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fefefefe cdw11:fefefe22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.913798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:01.483 #43 NEW cov: 11819 ft: 14410 corp: 26/642b lim: 40 exec/s: 43 rss: 69Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:01.483 [2024-11-18 19:06:19.953386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.953415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.953555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.953573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.953702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:22222222 cdw11:20222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.953730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:19.953866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:19.953883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.483 #44 NEW cov: 11819 ft: 14426 corp: 27/674b lim: 40 exec/s: 44 rss: 69Mb L: 32/40 MS: 1 ChangeBinInt- 00:08:01.483 [2024-11-18 19:06:20.003027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a430a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:20.003055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:20.003184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:20.003203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 #45 NEW cov: 11819 ft: 14465 corp: 28/696b lim: 40 exec/s: 45 rss: 69Mb L: 22/40 MS: 1 InsertByte- 00:08:01.483 [2024-11-18 19:06:20.053214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:20.053245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.483 [2024-11-18 19:06:20.053374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:00162222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.483 [2024-11-18 19:06:20.053391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.483 #46 NEW cov: 11819 ft: 14472 corp: 29/718b lim: 40 exec/s: 46 rss: 69Mb L: 22/40 MS: 1 ChangeBinInt- 00:08:01.743 [2024-11-18 19:06:20.103992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.104024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.104161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:222222fe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.104179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.104310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fefefefe cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.104328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.104457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:fefefe22 cdw11:54222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.104474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.743 #47 NEW cov: 11819 ft: 14549 corp: 30/752b lim: 40 exec/s: 47 rss: 69Mb L: 34/40 MS: 1 ChangeByte- 00:08:01.743 [2024-11-18 19:06:20.163807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.163835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.163989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.164008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.164135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:22222222 cdw11:2222e1e1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.164152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.743 #48 NEW cov: 11819 ft: 14562 corp: 31/779b lim: 40 exec/s: 48 rss: 69Mb L: 27/40 MS: 1 InsertRepeatedBytes- 00:08:01.743 [2024-11-18 19:06:20.203700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0822 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.203727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.203854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.203871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.743 #49 NEW cov: 11819 ft: 14575 corp: 32/796b lim: 40 exec/s: 49 rss: 69Mb 
L: 17/40 MS: 1 EraseBytes- 00:08:01.743 [2024-11-18 19:06:20.253805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00002222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.253834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.253972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.253989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.743 #50 NEW cov: 11819 ft: 14593 corp: 33/818b lim: 40 exec/s: 50 rss: 69Mb L: 22/40 MS: 1 EraseBytes- 00:08:01.743 [2024-11-18 19:06:20.303999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:46464646 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.304027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.743 [2024-11-18 19:06:20.304157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.743 [2024-11-18 19:06:20.304177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.743 #51 NEW cov: 11819 ft: 14601 corp: 34/841b lim: 40 exec/s: 51 rss: 69Mb L: 23/40 MS: 1 CopyPart- 00:08:02.003 [2024-11-18 19:06:20.365115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a08 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.365143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.365274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22222222 cdw11:22464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.365295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.365437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:46462222 cdw11:46464646 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.365455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.365588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:46462222 cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.365607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.365740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:fefefefe cdw11:fefefefe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.365756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.003 #52 NEW cov: 11819 ft: 14667 corp: 35/881b lim: 40 exec/s: 52 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:08:02.003 [2024-11-18 19:06:20.414762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:16000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.414789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.414920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0000faff cdw11:21222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.414940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.415019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:16000000 cdw11:0000faff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.415038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.415189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:21222222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.415206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.003 #53 NEW cov: 11819 ft: 14675 corp: 36/917b lim: 40 exec/s: 53 rss: 70Mb L: 36/40 MS: 1 CopyPart- 00:08:02.003 [2024-11-18 19:06:20.474455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0e070707 cdw11:0707bf07 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.474483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.003 [2024-11-18 19:06:20.474618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:07070707 cdw11:07070707 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.003 [2024-11-18 19:06:20.474638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.003 #54 NEW cov: 11819 ft: 14689 corp: 37/937b lim: 40 exec/s: 54 rss: 70Mb L: 20/40 MS: 1 InsertByte- 00:08:02.003 [2024-11-18 19:06:20.524628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0a1600 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.004 [2024-11-18 19:06:20.524655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.004 [2024-11-18 19:06:20.524790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00fe2222 cdw11:22222222 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.004 [2024-11-18 19:06:20.524821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.004 #55 NEW cov: 11819 ft: 14697 corp: 38/959b lim: 40 exec/s: 27 rss: 70Mb L: 22/40 MS: 1 ChangeBinInt- 00:08:02.004 #55 DONE cov: 11819 ft: 14697 corp: 38/959b lim: 40 exec/s: 27 rss: 70Mb 00:08:02.004 Done 
55 runs in 2 second(s) 00:08:02.263 19:06:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:02.263 19:06:20 -- ../common.sh@72 -- # (( i++ )) 00:08:02.263 19:06:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.263 19:06:20 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:02.263 19:06:20 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:02.263 19:06:20 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.263 19:06:20 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.263 19:06:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:02.263 19:06:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:02.263 19:06:20 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:02.263 19:06:20 -- nvmf/run.sh@29 -- # port=4412 00:08:02.263 19:06:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:02.263 19:06:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:02.263 19:06:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.263 19:06:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:02.263 [2024-11-18 19:06:20.711346] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:02.263 [2024-11-18 19:06:20.711430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1304534 ] 00:08:02.263 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.523 [2024-11-18 19:06:20.891009] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.523 [2024-11-18 19:06:20.954784] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.523 [2024-11-18 19:06:20.954918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.523 [2024-11-18 19:06:21.013610] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.523 [2024-11-18 19:06:21.029876] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:02.523 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.523 INFO: Seed: 3957058624 00:08:02.523 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:02.523 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:02.523 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:02.523 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.523 #2 INITED exec/s: 0 rss: 60Mb 00:08:02.523 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:02.523 This may also happen if the target rejected all inputs we tried so far 00:08:02.523 [2024-11-18 19:06:21.096544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-18 19:06:21.096584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.523 [2024-11-18 19:06:21.096718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-18 19:06:21.096737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.523 [2024-11-18 19:06:21.096850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.523 [2024-11-18 19:06:21.096869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.042 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:03.042 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.042 #7 NEW cov: 11590 ft: 11588 corp: 2/25b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 5 ChangeBit-ShuffleBytes-CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:03.042 [2024-11-18 19:06:21.427679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:23080000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.427729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.042 [2024-11-18 19:06:21.427876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.427900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.042 [2024-11-18 19:06:21.428045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.428069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.042 #10 NEW cov: 11703 ft: 12105 corp: 3/55b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:03.042 [2024-11-18 19:06:21.477412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.477443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.042 [2024-11-18 19:06:21.477583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 
19:06:21.477602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.042 #12 NEW cov: 11709 ft: 12471 corp: 4/71b lim: 40 exec/s: 0 rss: 68Mb L: 16/30 MS: 2 CrossOver-CrossOver- 00:08:03.042 [2024-11-18 19:06:21.527511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.527540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.042 [2024-11-18 19:06:21.527673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.042 [2024-11-18 19:06:21.527692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.042 #13 NEW cov: 11794 ft: 12814 corp: 5/87b lim: 40 exec/s: 0 rss: 68Mb L: 16/30 MS: 1 CopyPart- 00:08:03.042 [2024-11-18 19:06:21.587615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.043 [2024-11-18 19:06:21.587644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.043 [2024-11-18 19:06:21.587773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00001000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.043 [2024-11-18 19:06:21.587793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.043 #14 NEW cov: 11794 ft: 12961 corp: 6/103b lim: 40 exec/s: 0 rss: 68Mb L: 16/30 MS: 1 ChangeBinInt- 00:08:03.043 [2024-11-18 19:06:21.637563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00001000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.043 [2024-11-18 19:06:21.637592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 #15 NEW cov: 11794 ft: 13722 corp: 7/111b lim: 40 exec/s: 0 rss: 68Mb L: 8/30 MS: 1 EraseBytes- 00:08:03.302 [2024-11-18 19:06:21.687228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.687257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 #19 NEW cov: 11794 ft: 13823 corp: 8/124b lim: 40 exec/s: 0 rss: 68Mb L: 13/30 MS: 4 CopyPart-ShuffleBytes-ChangeBinInt-CrossOver- 00:08:03.302 [2024-11-18 19:06:21.737892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.737920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 #25 NEW cov: 11794 ft: 13938 corp: 9/134b lim: 40 exec/s: 0 rss: 68Mb L: 10/30 MS: 1 EraseBytes- 00:08:03.302 [2024-11-18 19:06:21.788235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
SEND (19) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.788263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 [2024-11-18 19:06:21.788382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00001000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.788398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.302 #26 NEW cov: 11794 ft: 13964 corp: 10/150b lim: 40 exec/s: 0 rss: 68Mb L: 16/30 MS: 1 ChangeByte- 00:08:03.302 [2024-11-18 19:06:21.838152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.838179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 #27 NEW cov: 11794 ft: 14005 corp: 11/159b lim: 40 exec/s: 0 rss: 68Mb L: 9/30 MS: 1 EraseBytes- 00:08:03.302 [2024-11-18 19:06:21.888572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.888600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.302 [2024-11-18 19:06:21.888728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.302 [2024-11-18 19:06:21.888745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.562 #32 NEW cov: 11794 ft: 14082 corp: 12/179b lim: 40 exec/s: 0 rss: 68Mb L: 20/30 MS: 5 CrossOver-CopyPart-CrossOver-ChangeBinInt-CrossOver- 00:08:03.562 [2024-11-18 19:06:21.939262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.939290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:21.939424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.939442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:21.939606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.939623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:21.939757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.939775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.562 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.562 #33 NEW cov: 11817 ft: 14482 corp: 13/211b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 CrossOver- 00:08:03.562 [2024-11-18 19:06:21.999601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.999631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:21.999758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.999775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:21.999902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:21.999919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.562 [2024-11-18 19:06:22.000040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:22.000058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.562 #34 NEW cov: 11817 ft: 14497 corp: 14/243b lim: 40 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 CrossOver- 00:08:03.562 [2024-11-18 19:06:22.058787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00f9ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:22.058817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.562 #35 NEW cov: 11817 ft: 14519 corp: 15/252b lim: 40 exec/s: 35 rss: 69Mb L: 9/32 MS: 1 ChangeBinInt- 00:08:03.562 [2024-11-18 19:06:22.109049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:22.109077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.562 #36 NEW cov: 11817 ft: 14562 corp: 16/267b lim: 40 exec/s: 36 rss: 69Mb L: 15/32 MS: 1 CrossOver- 00:08:03.562 [2024-11-18 19:06:22.159200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.562 [2024-11-18 19:06:22.159228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 #37 NEW cov: 11817 ft: 14594 corp: 17/276b lim: 40 exec/s: 37 rss: 69Mb L: 9/32 MS: 1 ChangeBinInt- 00:08:03.822 [2024-11-18 19:06:22.209621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:03.822 [2024-11-18 19:06:22.209650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 [2024-11-18 19:06:22.209774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.209792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.822 #38 NEW cov: 11817 ft: 14619 corp: 18/292b lim: 40 exec/s: 38 rss: 69Mb L: 16/32 MS: 1 ChangeBit- 00:08:03.822 [2024-11-18 19:06:22.259492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00a40000 cdw11:00100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.259520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 #39 NEW cov: 11817 ft: 14664 corp: 19/303b lim: 40 exec/s: 39 rss: 69Mb L: 11/32 MS: 1 InsertByte- 00:08:03.822 [2024-11-18 19:06:22.309893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.309921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 [2024-11-18 19:06:22.310054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000010 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.310072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.822 #40 NEW cov: 11817 ft: 14669 corp: 20/320b lim: 40 exec/s: 40 rss: 69Mb L: 17/32 MS: 1 CrossOver- 00:08:03.822 [2024-11-18 19:06:22.359943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.359970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 #41 NEW cov: 11817 ft: 14686 corp: 21/333b lim: 40 exec/s: 41 rss: 69Mb L: 13/32 MS: 1 EraseBytes- 00:08:03.822 [2024-11-18 19:06:22.410905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.410933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.822 [2024-11-18 19:06:22.411056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.411078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.822 [2024-11-18 19:06:22.411209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.411226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.822 [2024-11-18 19:06:22.411353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.822 [2024-11-18 19:06:22.411369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.082 #47 NEW cov: 11817 ft: 14755 corp: 22/365b lim: 40 exec/s: 47 rss: 69Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:04.082 [2024-11-18 19:06:22.461060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.461094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.082 [2024-11-18 19:06:22.461225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00411000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.461244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.082 [2024-11-18 19:06:22.461387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.461406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.082 [2024-11-18 19:06:22.461486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.461503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.082 #48 NEW cov: 11817 ft: 14792 corp: 23/397b lim: 40 exec/s: 48 rss: 69Mb L: 32/32 MS: 1 ChangeByte- 00:08:04.082 [2024-11-18 19:06:22.520499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:f6ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.520529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.082 #54 NEW cov: 11817 ft: 14806 corp: 24/406b lim: 40 exec/s: 54 rss: 69Mb L: 9/32 MS: 1 ChangeBinInt- 00:08:04.082 [2024-11-18 19:06:22.570883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.570912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.082 [2024-11-18 19:06:22.571036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.571054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.082 #55 NEW cov: 11817 ft: 14807 corp: 25/422b lim: 40 exec/s: 55 rss: 69Mb L: 16/32 MS: 1 ShuffleBytes- 00:08:04.082 [2024-11-18 19:06:22.620699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2e001000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.620729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.082 #56 NEW cov: 11817 ft: 14821 corp: 26/430b lim: 40 exec/s: 56 rss: 69Mb L: 8/32 MS: 1 ChangeByte- 00:08:04.082 [2024-11-18 19:06:22.681263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.681292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.082 [2024-11-18 19:06:22.681423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:01000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.082 [2024-11-18 19:06:22.681440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 #57 NEW cov: 11817 ft: 14831 corp: 27/446b lim: 40 exec/s: 57 rss: 69Mb L: 16/32 MS: 1 ChangeBit- 00:08:04.341 [2024-11-18 19:06:22.741161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.741197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 #58 NEW cov: 11817 ft: 14848 corp: 28/457b lim: 40 exec/s: 58 rss: 69Mb L: 11/32 MS: 1 EraseBytes- 00:08:04.341 [2024-11-18 19:06:22.802190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.802220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 [2024-11-18 19:06:22.802350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.802368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 [2024-11-18 19:06:22.802497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:10100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.802514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.341 [2024-11-18 19:06:22.802644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.802665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.341 #59 NEW cov: 11817 ft: 14853 corp: 29/492b lim: 40 exec/s: 59 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:08:04.341 [2024-11-18 19:06:22.852369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 
[2024-11-18 19:06:22.852399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.341 [2024-11-18 19:06:22.852530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.852555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.341 [2024-11-18 19:06:22.852699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.341 [2024-11-18 19:06:22.852717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.342 [2024-11-18 19:06:22.852852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.342 [2024-11-18 19:06:22.852870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.342 #60 NEW cov: 11817 ft: 14869 corp: 30/527b lim: 40 exec/s: 60 rss: 69Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:04.342 [2024-11-18 19:06:22.902522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.342 [2024-11-18 19:06:22.902556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.342 [2024-11-18 19:06:22.902691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000000b3 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.342 [2024-11-18 19:06:22.902712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.342 [2024-11-18 19:06:22.902844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:10100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.342 [2024-11-18 19:06:22.902866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.342 [2024-11-18 19:06:22.903008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.342 [2024-11-18 19:06:22.903028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.342 #61 NEW cov: 11817 ft: 14890 corp: 31/562b lim: 40 exec/s: 61 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:08:04.601 [2024-11-18 19:06:22.962412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:22.962441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.601 [2024-11-18 19:06:22.962583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00080000 cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:22.962603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.601 [2024-11-18 19:06:22.962746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:22.962764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.601 #62 NEW cov: 11817 ft: 14919 corp: 32/586b lim: 40 exec/s: 62 rss: 69Mb L: 24/35 MS: 1 ChangeBit- 00:08:04.601 [2024-11-18 19:06:23.022868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00030000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:23.022897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.601 [2024-11-18 19:06:23.023029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00001000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:23.023048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.601 [2024-11-18 19:06:23.023188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:23.023207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.601 [2024-11-18 19:06:23.023332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:23.023350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.601 #63 NEW cov: 11817 ft: 14924 corp: 33/618b lim: 40 exec/s: 63 rss: 70Mb L: 32/35 MS: 1 CrossOver- 00:08:04.601 [2024-11-18 19:06:23.082218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000010 cdw11:00f9fbff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.601 [2024-11-18 19:06:23.082247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.601 #64 pulse cov: 11817 ft: 14925 corp: 33/618b lim: 40 exec/s: 32 rss: 70Mb 00:08:04.601 #64 NEW cov: 11817 ft: 14925 corp: 34/627b lim: 40 exec/s: 32 rss: 70Mb L: 9/35 MS: 1 ChangeBit- 00:08:04.601 #64 DONE cov: 11817 ft: 14925 corp: 34/627b lim: 40 exec/s: 32 rss: 70Mb 00:08:04.601 Done 64 runs in 2 second(s) 00:08:04.861 19:06:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:04.861 19:06:23 -- ../common.sh@72 -- # (( i++ )) 00:08:04.861 19:06:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.861 19:06:23 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:04.861 19:06:23 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:04.861 19:06:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.861 19:06:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.861 19:06:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 
00:08:04.861 19:06:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:04.861 19:06:23 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:04.861 19:06:23 -- nvmf/run.sh@29 -- # port=4413 00:08:04.861 19:06:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:04.861 19:06:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:04.861 19:06:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.861 19:06:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:04.861 [2024-11-18 19:06:23.269280] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:04.861 [2024-11-18 19:06:23.269364] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1305077 ] 00:08:04.861 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.861 [2024-11-18 19:06:23.446673] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.120 [2024-11-18 19:06:23.513725] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.120 [2024-11-18 19:06:23.513850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.120 [2024-11-18 19:06:23.572326] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.120 [2024-11-18 19:06:23.588626] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:05.120 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.120 INFO: Seed: 2222089407 00:08:05.120 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:05.120 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:05.120 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.120 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.120 #2 INITED exec/s: 0 rss: 60Mb 00:08:05.120 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.120 This may also happen if the target rejected all inputs we tried so far 00:08:05.120 [2024-11-18 19:06:23.643890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.120 [2024-11-18 19:06:23.643918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.380 NEW_FUNC[1/670]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:05.380 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.381 #10 NEW cov: 11578 ft: 11579 corp: 2/9b lim: 40 exec/s: 0 rss: 68Mb L: 8/8 MS: 3 ChangeByte-InsertRepeatedBytes-CopyPart- 00:08:05.381 [2024-11-18 19:06:23.944510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5ffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.381 [2024-11-18 19:06:23.944539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.381 #11 NEW cov: 11691 ft: 12122 corp: 3/17b lim: 40 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:05.643 [2024-11-18 19:06:23.984952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:23.984982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.643 [2024-11-18 19:06:23.985040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:23.985054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.643 [2024-11-18 19:06:23.985108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:23.985121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.643 [2024-11-18 19:06:23.985174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:23.985187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.643 #12 NEW cov: 11697 ft: 12974 corp: 4/51b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:05.643 [2024-11-18 19:06:24.024685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:24.024712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.643 #18 NEW cov: 11782 ft: 13245 corp: 5/66b lim: 40 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 InsertRepeatedBytes- 00:08:05.643 
[2024-11-18 19:06:24.064816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:24.064843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.643 #19 NEW cov: 11782 ft: 13315 corp: 6/81b lim: 40 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 ChangeBit- 00:08:05.643 [2024-11-18 19:06:24.104896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.643 [2024-11-18 19:06:24.104921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.643 #20 NEW cov: 11782 ft: 13435 corp: 7/96b lim: 40 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 ChangeByte- 00:08:05.644 [2024-11-18 19:06:24.145039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.644 [2024-11-18 19:06:24.145065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.644 #21 NEW cov: 11782 ft: 13503 corp: 8/104b lim: 40 exec/s: 0 rss: 69Mb L: 8/34 MS: 1 ShuffleBytes- 00:08:05.644 [2024-11-18 19:06:24.185099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d237df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.644 [2024-11-18 19:06:24.185124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.644 #22 NEW cov: 11782 ft: 13627 corp: 9/119b lim: 40 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 ChangeByte- 00:08:05.644 [2024-11-18 19:06:24.215325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.644 [2024-11-18 19:06:24.215349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.644 [2024-11-18 19:06:24.215410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.644 [2024-11-18 19:06:24.215431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.644 #23 NEW cov: 11782 ft: 13885 corp: 10/135b lim: 40 exec/s: 0 rss: 69Mb L: 16/34 MS: 1 CopyPart- 00:08:05.933 [2024-11-18 19:06:24.255346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.255372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 #24 NEW cov: 11782 ft: 14013 corp: 11/150b lim: 40 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 ShuffleBytes- 00:08:05.933 [2024-11-18 19:06:24.295695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d626262 cdw11:62626262 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.295721] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.295775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:62626262 cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.295789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.295843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7d7df5ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.295856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.933 #25 NEW cov: 11782 ft: 14199 corp: 12/176b lim: 40 exec/s: 0 rss: 69Mb L: 26/34 MS: 1 InsertRepeatedBytes- 00:08:05.933 [2024-11-18 19:06:24.335945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.335970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.336026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00050000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.336040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.336093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.336106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.336160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.336173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.933 #26 NEW cov: 11782 ft: 14217 corp: 13/210b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:05.933 [2024-11-18 19:06:24.375820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.375845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.375901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:feff7d7d cdw11:7dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.375918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 #27 NEW cov: 11782 ft: 14243 corp: 14/228b lim: 40 exec/s: 0 rss: 69Mb L: 18/34 MS: 1 CrossOver- 00:08:05.933 [2024-11-18 19:06:24.416034] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7ddcdc cdw11:dcdcdcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.416059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.416131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdc7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.416145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.416197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7d7d7d7d cdw11:f5feff7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.416210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.933 #28 NEW cov: 11782 ft: 14355 corp: 15/259b lim: 40 exec/s: 0 rss: 69Mb L: 31/34 MS: 1 InsertRepeatedBytes- 00:08:05.933 [2024-11-18 19:06:24.456190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.456214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.456285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.456300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.456354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2f2f2fff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.456367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.933 #29 NEW cov: 11782 ft: 14379 corp: 16/284b lim: 40 exec/s: 0 rss: 69Mb L: 25/34 MS: 1 InsertRepeatedBytes- 00:08:05.933 [2024-11-18 19:06:24.496293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d626262 cdw11:9e9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.496318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.496390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:9d9d9da3 cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.496404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.933 [2024-11-18 19:06:24.496459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7d7df5ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.933 [2024-11-18 19:06:24.496472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:08:05.933 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.933 #30 NEW cov: 11805 ft: 14400 corp: 17/310b lim: 40 exec/s: 0 rss: 70Mb L: 26/34 MS: 1 ChangeBinInt- 00:08:06.198 [2024-11-18 19:06:24.536309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:ff7df57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.536338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 [2024-11-18 19:06:24.536393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7dffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.536407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.198 #31 NEW cov: 11805 ft: 14435 corp: 18/326b lim: 40 exec/s: 0 rss: 70Mb L: 16/34 MS: 1 ShuffleBytes- 00:08:06.198 [2024-11-18 19:06:24.576291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5fff5ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.576316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 #32 NEW cov: 11805 ft: 14456 corp: 19/339b lim: 40 exec/s: 0 rss: 70Mb L: 13/34 MS: 1 CopyPart- 00:08:06.198 [2024-11-18 19:06:24.616414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:ffffff2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.616439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 #33 NEW cov: 11805 ft: 14501 corp: 20/347b lim: 40 exec/s: 33 rss: 70Mb L: 8/34 MS: 1 EraseBytes- 00:08:06.198 [2024-11-18 19:06:24.656937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.656962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 [2024-11-18 19:06:24.657035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00050000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.657050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.198 [2024-11-18 19:06:24.657105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.657118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.198 [2024-11-18 19:06:24.657173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.657189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.198 #34 NEW cov: 11805 ft: 14512 corp: 21/381b lim: 40 exec/s: 34 rss: 70Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:06.198 [2024-11-18 19:06:24.696632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.696659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 #35 NEW cov: 11805 ft: 14530 corp: 22/396b lim: 40 exec/s: 35 rss: 70Mb L: 15/34 MS: 1 CrossOver- 00:08:06.198 [2024-11-18 19:06:24.736779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.736804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 #36 NEW cov: 11805 ft: 14543 corp: 23/408b lim: 40 exec/s: 36 rss: 70Mb L: 12/34 MS: 1 EraseBytes- 00:08:06.198 [2024-11-18 19:06:24.777031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:317d7d7d cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.777059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.198 [2024-11-18 19:06:24.777131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.198 [2024-11-18 19:06:24.777145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.198 #37 NEW cov: 11805 ft: 14545 corp: 24/425b lim: 40 exec/s: 37 rss: 70Mb L: 17/34 MS: 1 InsertByte- 00:08:06.463 [2024-11-18 19:06:24.807231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.807256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.463 [2024-11-18 19:06:24.807312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffff2f2f cdw11:2f2f2f2f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.807326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.463 [2024-11-18 19:06:24.807380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:2f2f2fff cdw11:ffff2fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.807393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.463 #38 NEW cov: 11805 ft: 14585 corp: 25/450b lim: 40 exec/s: 38 rss: 70Mb L: 25/34 MS: 1 CopyPart- 00:08:06.463 [2024-11-18 19:06:24.847251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:ff7df57d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.847275] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.463 [2024-11-18 19:06:24.847348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:83000000 cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.847361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.463 #39 NEW cov: 11805 ft: 14592 corp: 26/466b lim: 40 exec/s: 39 rss: 70Mb L: 16/34 MS: 1 ChangeBinInt- 00:08:06.463 [2024-11-18 19:06:24.887380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:317d7d92 cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.887405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.463 [2024-11-18 19:06:24.887461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.463 [2024-11-18 19:06:24.887475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.464 #40 NEW cov: 11805 ft: 14605 corp: 27/483b lim: 40 exec/s: 40 rss: 70Mb L: 17/34 MS: 1 ChangeByte- 00:08:06.464 [2024-11-18 19:06:24.927381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7dfdf5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 19:06:24.927406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.464 #41 NEW cov: 11805 ft: 14612 corp: 28/498b lim: 40 exec/s: 41 rss: 70Mb L: 15/34 MS: 1 ChangeBit- 00:08:06.464 [2024-11-18 19:06:24.957536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7daa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 19:06:24.957565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.464 [2024-11-18 19:06:24.957637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 19:06:24.957651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.464 #42 NEW cov: 11805 ft: 14682 corp: 29/520b lim: 40 exec/s: 42 rss: 70Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:08:06.464 [2024-11-18 19:06:24.997558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 19:06:24.997582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.464 #43 NEW cov: 11805 ft: 14689 corp: 30/534b lim: 40 exec/s: 43 rss: 70Mb L: 14/34 MS: 1 CrossOver- 00:08:06.464 [2024-11-18 19:06:25.027670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:317d7d7d cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 
19:06:25.027694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.464 [2024-11-18 19:06:25.027748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.464 [2024-11-18 19:06:25.027769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.464 #44 NEW cov: 11805 ft: 14720 corp: 31/551b lim: 40 exec/s: 44 rss: 70Mb L: 17/34 MS: 1 CrossOver- 00:08:06.722 [2024-11-18 19:06:25.067716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:f5ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.722 [2024-11-18 19:06:25.067742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.722 #45 NEW cov: 11805 ft: 14733 corp: 32/565b lim: 40 exec/s: 45 rss: 70Mb L: 14/34 MS: 1 ShuffleBytes- 00:08:06.722 [2024-11-18 19:06:25.107813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.722 [2024-11-18 19:06:25.107837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.722 #46 NEW cov: 11805 ft: 14771 corp: 33/573b lim: 40 exec/s: 46 rss: 70Mb L: 8/34 MS: 1 ChangeByte- 00:08:06.722 [2024-11-18 19:06:25.148044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.722 [2024-11-18 19:06:25.148068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.148139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffff5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.148153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.178134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:4affffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.178159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.178232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffff5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.178246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.723 #48 NEW cov: 11805 ft: 14781 corp: 34/595b lim: 40 exec/s: 48 rss: 70Mb L: 22/34 MS: 2 InsertRepeatedBytes-ChangeByte- 00:08:06.723 [2024-11-18 19:06:25.208215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7a7d7d cdw11:7d7d237d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.208239] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.208312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f5ffffff cdw11:ffffffed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.208325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.723 #49 NEW cov: 11805 ft: 14791 corp: 35/611b lim: 40 exec/s: 49 rss: 70Mb L: 16/34 MS: 1 InsertByte- 00:08:06.723 [2024-11-18 19:06:25.248458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.248482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.248558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.248572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.248627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.248640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.723 #50 NEW cov: 11805 ft: 14806 corp: 36/636b lim: 40 exec/s: 50 rss: 70Mb L: 25/34 MS: 1 InsertRepeatedBytes- 00:08:06.723 [2024-11-18 19:06:25.288563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:7d7d7df5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.288587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.288661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.288674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.723 [2024-11-18 19:06:25.288729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bfffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.723 [2024-11-18 19:06:25.288743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.723 #51 NEW cov: 11805 ft: 14815 corp: 37/661b lim: 40 exec/s: 51 rss: 70Mb L: 25/34 MS: 1 ChangeBit- 00:08:06.982 [2024-11-18 19:06:25.328755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d626262 cdw11:9e9d9d9d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.328782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.328840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a39d9d9d cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.328853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.328909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7d7df5ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.328926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.982 #52 NEW cov: 11805 ft: 14823 corp: 38/687b lim: 40 exec/s: 52 rss: 70Mb L: 26/34 MS: 1 ShuffleBytes- 00:08:06.982 [2024-11-18 19:06:25.368716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.368740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.368808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000057d cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.368822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.982 #53 NEW cov: 11805 ft: 14828 corp: 39/709b lim: 40 exec/s: 53 rss: 70Mb L: 22/34 MS: 1 CrossOver- 00:08:06.982 [2024-11-18 19:06:25.408710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.408735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.982 #54 NEW cov: 11805 ft: 14842 corp: 40/724b lim: 40 exec/s: 54 rss: 70Mb L: 15/34 MS: 1 ShuffleBytes- 00:08:06.982 [2024-11-18 19:06:25.438916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.438941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.439011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00055d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.439024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.982 #55 NEW cov: 11805 ft: 14859 corp: 41/740b lim: 40 exec/s: 55 rss: 70Mb L: 16/34 MS: 1 InsertByte- 00:08:06.982 [2024-11-18 19:06:25.479128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.479152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.479222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000005 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.479235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.479287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7daaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.982 [2024-11-18 19:06:25.479299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.982 [2024-11-18 19:06:25.519325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.983 [2024-11-18 19:06:25.519349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.983 [2024-11-18 19:06:25.519421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000005 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.983 [2024-11-18 19:06:25.519436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.983 [2024-11-18 19:06:25.519494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7daaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.983 [2024-11-18 19:06:25.519508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.983 #57 NEW cov: 11805 ft: 14967 corp: 42/767b lim: 40 exec/s: 57 rss: 70Mb L: 27/34 MS: 2 InsertRepeatedBytes-ChangeBit- 00:08:06.983 [2024-11-18 19:06:25.559146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff2f cdw11:7d7d7d7d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.983 [2024-11-18 19:06:25.559171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.983 #58 NEW cov: 11805 ft: 14968 corp: 43/779b lim: 40 exec/s: 58 rss: 70Mb L: 12/34 MS: 1 CopyPart- 00:08:07.242 [2024-11-18 19:06:25.599412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7d7d7d7d cdw11:42424242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.242 [2024-11-18 19:06:25.599437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.242 [2024-11-18 19:06:25.599505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:424242ff cdw11:7df57d83 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.242 [2024-11-18 19:06:25.599519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.242 #59 NEW cov: 11805 ft: 14982 corp: 44/802b lim: 40 exec/s: 29 rss: 70Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:08:07.242 #59 DONE cov: 11805 ft: 14982 corp: 44/802b lim: 40 exec/s: 29 rss: 70Mb 00:08:07.242 Done 59 runs in 2 second(s) 00:08:07.242 19:06:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:07.242 19:06:25 -- ../common.sh@72 -- # (( i++ )) 00:08:07.242 19:06:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.242 19:06:25 -- ../common.sh@73 -- # 
start_llvm_fuzz 14 1 0x1 00:08:07.242 19:06:25 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:07.242 19:06:25 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.242 19:06:25 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.242 19:06:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.242 19:06:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:07.242 19:06:25 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:07.242 19:06:25 -- nvmf/run.sh@29 -- # port=4414 00:08:07.242 19:06:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.242 19:06:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:07.242 19:06:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.242 19:06:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:07.242 [2024-11-18 19:06:25.787973] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:07.242 [2024-11-18 19:06:25.788055] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1305482 ] 00:08:07.242 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.501 [2024-11-18 19:06:25.972866] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.501 [2024-11-18 19:06:26.037513] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.501 [2024-11-18 19:06:26.037641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.501 [2024-11-18 19:06:26.095739] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.761 [2024-11-18 19:06:26.112057] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:07.761 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.761 INFO: Seed: 449139637 00:08:07.761 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:07.761 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:07.761 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.761 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.761 #2 INITED exec/s: 0 rss: 60Mb 00:08:07.761 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
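A note on reading the stream that follows: each numbered libFuzzer status line (for example "#23 NEW cov: 11783 ft: 13845 corp: 6/100b lim: 35 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 CrossOver-") reports the event number, covered code edges (cov), coverage features (ft), corpus size as entries/bytes (corp), the current input-length cap (lim), executions per second (exec/s), resident memory (rss), the new input's length against the largest in the corpus (L), and the mutation sequence that produced it (MS). The surrounding *NOTICE* pairs are the harness printing each fuzzed admin command (nvme_admin_qpair_print_command) and the target's completion (spdk_nvme_print_completion); a status such as INVALID FIELD (00/02) or FEATURE ID NOT SAVEABLE (01/0d) is the NVMe status-code-type/status-code pair, i.e. the target is rejecting the malformed commands as expected. To pull the coverage trend out of a captured run, a grep along these lines is enough; the log file name below is only an assumed example, not a path from this job:

    grep -Eo '#[0-9]+ NEW cov: [0-9]+ ft: [0-9]+' fuzz_json_14.log | tail -n 5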
00:08:07.761 This may also happen if the target rejected all inputs we tried so far 00:08:07.761 [2024-11-18 19:06:26.157456] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.761 [2024-11-18 19:06:26.157487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.761 [2024-11-18 19:06:26.157545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.761 [2024-11-18 19:06:26.157564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.021 NEW_FUNC[1/671]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:08.021 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.021 #11 NEW cov: 11567 ft: 11573 corp: 2/18b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 4 InsertByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:08.021 [2024-11-18 19:06:26.478491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.021 [2024-11-18 19:06:26.478523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.021 [2024-11-18 19:06:26.478585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.021 [2024-11-18 19:06:26.478600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.021 [2024-11-18 19:06:26.478657] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.021 [2024-11-18 19:06:26.478670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.021 [2024-11-18 19:06:26.478727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.021 [2024-11-18 19:06:26.478744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.021 #17 NEW cov: 11692 ft: 12475 corp: 3/48b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:08.021 [2024-11-18 19:06:26.528403] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.021 [2024-11-18 19:06:26.528429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.021 [2024-11-18 19:06:26.528484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.022 [2024-11-18 19:06:26.528500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.022 [2024-11-18 19:06:26.528558] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.022 [2024-11-18 19:06:26.528578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.022 #21 NEW cov: 11698 ft: 12834 corp: 4/73b lim: 35 exec/s: 0 rss: 69Mb L: 25/30 MS: 4 ChangeBit-CMP-ChangeByte-InsertRepeatedBytes- DE: "\000\002"- 00:08:08.022 [2024-11-18 19:06:26.568377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.022 [2024-11-18 19:06:26.568403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.022 [2024-11-18 19:06:26.568460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.022 [2024-11-18 19:06:26.568474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.022 #22 NEW cov: 11783 ft: 13068 corp: 5/87b lim: 35 exec/s: 0 rss: 69Mb L: 14/30 MS: 1 CrossOver- 00:08:08.022 [2024-11-18 19:06:26.608329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.022 [2024-11-18 19:06:26.608356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.281 #23 NEW cov: 11783 ft: 13845 corp: 6/100b lim: 35 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 CrossOver- 00:08:08.282 [2024-11-18 19:06:26.648461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.648490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 #24 NEW cov: 11783 ft: 13946 corp: 7/113b lim: 35 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 ChangeByte- 00:08:08.282 [2024-11-18 19:06:26.688561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.688588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 #30 NEW cov: 11783 ft: 14002 corp: 8/126b lim: 35 exec/s: 0 rss: 69Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:08.282 [2024-11-18 19:06:26.728647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.728674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 #31 NEW cov: 11783 ft: 14047 corp: 9/137b lim: 35 exec/s: 0 rss: 69Mb L: 11/30 MS: 1 EraseBytes- 00:08:08.282 [2024-11-18 19:06:26.768948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.768975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 [2024-11-18 19:06:26.769033] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.769049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.282 #32 NEW cov: 11783 ft: 14076 corp: 10/151b lim: 35 exec/s: 0 rss: 69Mb L: 14/30 MS: 1 InsertByte- 00:08:08.282 [2024-11-18 19:06:26.809058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.809083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 [2024-11-18 19:06:26.809139] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.809156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.282 #33 NEW cov: 11783 ft: 14112 corp: 11/167b lim: 35 exec/s: 0 rss: 69Mb L: 16/30 MS: 1 PersAutoDict- DE: "\000\002"- 00:08:08.282 [2024-11-18 19:06:26.849022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.282 [2024-11-18 19:06:26.849050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.282 #34 NEW cov: 11783 ft: 14130 corp: 12/180b lim: 35 exec/s: 0 rss: 70Mb L: 13/30 MS: 1 ChangeBit- 00:08:08.541 [2024-11-18 19:06:26.889134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:26.889160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.541 #35 NEW cov: 11783 ft: 14229 corp: 13/193b lim: 35 exec/s: 0 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:08.541 [2024-11-18 19:06:26.929361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:26.929388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.541 [2024-11-18 19:06:26.929448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:26.929463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.541 #36 NEW cov: 11783 ft: 14288 corp: 14/207b lim: 35 exec/s: 0 rss: 70Mb L: 14/30 MS: 1 CopyPart- 00:08:08.541 [2024-11-18 19:06:26.969332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:26.969360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.541 #37 NEW cov: 11783 ft: 14337 corp: 15/220b lim: 35 exec/s: 0 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:08.541 [2024-11-18 19:06:27.009625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:4 cdw10:800000df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:27.009652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.541 [2024-11-18 19:06:27.009722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:27.009738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.541 #38 NEW cov: 11783 ft: 14396 corp: 16/234b lim: 35 exec/s: 0 rss: 70Mb L: 14/30 MS: 1 ChangeBit- 00:08:08.541 [2024-11-18 19:06:27.049618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.541 [2024-11-18 19:06:27.049645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.541 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.542 #39 NEW cov: 11806 ft: 14489 corp: 17/245b lim: 35 exec/s: 0 rss: 70Mb L: 11/30 MS: 1 ChangeByte- 00:08:08.542 [2024-11-18 19:06:27.089734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.542 [2024-11-18 19:06:27.089761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.542 #40 NEW cov: 11806 ft: 14513 corp: 18/258b lim: 35 exec/s: 0 rss: 70Mb L: 13/30 MS: 1 ChangeByte- 00:08:08.542 [2024-11-18 19:06:27.129856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.542 [2024-11-18 19:06:27.129885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #41 NEW cov: 11806 ft: 14552 corp: 19/271b lim: 35 exec/s: 41 rss: 70Mb L: 13/30 MS: 1 CMP- DE: "\000\000\000\005"- 00:08:08.801 [2024-11-18 19:06:27.169976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.170004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #42 NEW cov: 11806 ft: 14570 corp: 20/284b lim: 35 exec/s: 42 rss: 70Mb L: 13/30 MS: 1 CrossOver- 00:08:08.801 [2024-11-18 19:06:27.210059] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.210087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #43 NEW cov: 11806 ft: 14584 corp: 21/291b lim: 35 exec/s: 43 rss: 70Mb L: 7/30 MS: 1 CrossOver- 00:08:08.801 [2024-11-18 19:06:27.250183] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.250210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #44 NEW 
cov: 11806 ft: 14605 corp: 22/303b lim: 35 exec/s: 44 rss: 70Mb L: 12/30 MS: 1 EraseBytes- 00:08:08.801 [2024-11-18 19:06:27.290420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.290447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-11-18 19:06:27.290506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.290519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 #45 NEW cov: 11806 ft: 14612 corp: 23/318b lim: 35 exec/s: 45 rss: 70Mb L: 15/30 MS: 1 PersAutoDict- DE: "\000\002"- 00:08:08.801 [2024-11-18 19:06:27.330379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.330405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 #46 NEW cov: 11806 ft: 14632 corp: 24/328b lim: 35 exec/s: 46 rss: 70Mb L: 10/30 MS: 1 EraseBytes- 00:08:08.801 [2024-11-18 19:06:27.370977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.371004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.801 [2024-11-18 19:06:27.371063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.371079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.801 [2024-11-18 19:06:27.371133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.371147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.801 [2024-11-18 19:06:27.371204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.801 [2024-11-18 19:06:27.371219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.801 #47 NEW cov: 11806 ft: 14655 corp: 25/358b lim: 35 exec/s: 47 rss: 70Mb L: 30/30 MS: 1 CrossOver- 00:08:09.061 [2024-11-18 19:06:27.411072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.411100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-11-18 19:06:27.411157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.411173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID 
NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 [2024-11-18 19:06:27.411231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.411245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.061 [2024-11-18 19:06:27.411301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.411315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.061 #48 NEW cov: 11806 ft: 14701 corp: 26/386b lim: 35 exec/s: 48 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:08:09.061 [2024-11-18 19:06:27.450724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.450751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 #49 NEW cov: 11806 ft: 14707 corp: 27/399b lim: 35 exec/s: 49 rss: 70Mb L: 13/30 MS: 1 ChangeBit- 00:08:09.061 [2024-11-18 19:06:27.490998] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.491026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-11-18 19:06:27.491083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.491098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 #50 NEW cov: 11806 ft: 14712 corp: 28/416b lim: 35 exec/s: 50 rss: 70Mb L: 17/30 MS: 1 PersAutoDict- DE: "\000\002"- 00:08:09.061 [2024-11-18 19:06:27.530954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.530981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 #51 NEW cov: 11806 ft: 14735 corp: 29/429b lim: 35 exec/s: 51 rss: 70Mb L: 13/30 MS: 1 ChangeByte- 00:08:09.061 [2024-11-18 19:06:27.571208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.571235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 [2024-11-18 19:06:27.571293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000084 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.571308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.061 #52 NEW cov: 11806 ft: 14750 corp: 30/448b lim: 35 exec/s: 52 rss: 70Mb L: 19/30 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\004"- 00:08:09.061 [2024-11-18 19:06:27.611198] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.611227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.061 #53 NEW cov: 11806 ft: 14760 corp: 31/460b lim: 35 exec/s: 53 rss: 70Mb L: 12/30 MS: 1 ChangeBinInt- 00:08:09.061 [2024-11-18 19:06:27.651322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.061 [2024-11-18 19:06:27.651349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 #54 NEW cov: 11806 ft: 14777 corp: 32/467b lim: 35 exec/s: 54 rss: 70Mb L: 7/30 MS: 1 CMP- DE: "\004\000\000\000"- 00:08:09.321 [2024-11-18 19:06:27.691429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.691455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 #55 NEW cov: 11806 ft: 14779 corp: 33/474b lim: 35 exec/s: 55 rss: 70Mb L: 7/30 MS: 1 CrossOver- 00:08:09.321 [2024-11-18 19:06:27.732016] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.732044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.732103] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000079 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.732118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.732175] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.732189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.732244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.732258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.321 #56 NEW cov: 11806 ft: 14824 corp: 34/502b lim: 35 exec/s: 56 rss: 70Mb L: 28/30 MS: 1 ChangeBinInt- 00:08:09.321 [2024-11-18 19:06:27.771793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.771820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.771882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.771897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:08:09.321 #57 NEW cov: 11806 ft: 14843 corp: 35/516b lim: 35 exec/s: 57 rss: 70Mb L: 14/30 MS: 1 CopyPart- 00:08:09.321 [2024-11-18 19:06:27.812234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.812261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.812319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000079 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.812332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.812389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.812408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.812466] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.812479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.321 #58 NEW cov: 11806 ft: 14866 corp: 36/544b lim: 35 exec/s: 58 rss: 70Mb L: 28/30 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:08:09.321 [2024-11-18 19:06:27.852212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.852240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.852297] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.852311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.321 [2024-11-18 19:06:27.852366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.852381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.321 #59 NEW cov: 11806 ft: 14877 corp: 37/571b lim: 35 exec/s: 59 rss: 70Mb L: 27/30 MS: 1 CopyPart- 00:08:09.321 [2024-11-18 19:06:27.891973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.321 [2024-11-18 19:06:27.892000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.321 #60 NEW cov: 11806 ft: 14939 corp: 38/581b lim: 35 exec/s: 60 rss: 70Mb L: 10/30 MS: 1 CrossOver- 00:08:09.582 [2024-11-18 19:06:27.932614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:27.932642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 [2024-11-18 19:06:27.932698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000001a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:27.932715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.582 [2024-11-18 19:06:27.932771] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:6 cdw10:0000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:27.932785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.582 [2024-11-18 19:06:27.932839] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:27.932855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.582 #61 NEW cov: 11806 ft: 14942 corp: 39/611b lim: 35 exec/s: 61 rss: 70Mb L: 30/30 MS: 1 ChangeByte- 00:08:09.582 [2024-11-18 19:06:27.972208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:27.972235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 #62 NEW cov: 11806 ft: 14951 corp: 40/624b lim: 35 exec/s: 62 rss: 70Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:09.582 [2024-11-18 19:06:28.012469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:28.012498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 [2024-11-18 19:06:28.012561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:28.012576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.582 #63 NEW cov: 11806 ft: 14952 corp: 41/642b lim: 35 exec/s: 63 rss: 70Mb L: 18/30 MS: 1 CopyPart- 00:08:09.582 [2024-11-18 19:06:28.042578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:28.042604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 [2024-11-18 19:06:28.042663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:28.042679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.582 #64 NEW cov: 11806 ft: 14991 corp: 42/656b lim: 35 exec/s: 64 rss: 70Mb L: 14/30 MS: 1 ShuffleBytes- 00:08:09.582 [2024-11-18 19:06:28.082544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000df SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 
[2024-11-18 19:06:28.082574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 #65 NEW cov: 11806 ft: 14998 corp: 43/667b lim: 35 exec/s: 65 rss: 70Mb L: 11/30 MS: 1 CrossOver- 00:08:09.582 [2024-11-18 19:06:28.122665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.582 [2024-11-18 19:06:28.122692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.582 #66 NEW cov: 11806 ft: 15001 corp: 44/680b lim: 35 exec/s: 33 rss: 71Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:09.582 #66 DONE cov: 11806 ft: 15001 corp: 44/680b lim: 35 exec/s: 33 rss: 71Mb 00:08:09.582 ###### Recommended dictionary. ###### 00:08:09.582 "\000\002" # Uses: 3 00:08:09.582 "\000\000\000\005" # Uses: 0 00:08:09.582 "\000\000\000\000\000\000\000\004" # Uses: 0 00:08:09.582 "\004\000\000\000" # Uses: 1 00:08:09.582 ###### End of recommended dictionary. ###### 00:08:09.582 Done 66 runs in 2 second(s) 00:08:09.842 19:06:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:09.842 19:06:28 -- ../common.sh@72 -- # (( i++ )) 00:08:09.842 19:06:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.842 19:06:28 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:09.842 19:06:28 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:09.842 19:06:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.842 19:06:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.842 19:06:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:09.842 19:06:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:09.842 19:06:28 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:09.842 19:06:28 -- nvmf/run.sh@29 -- # port=4415 00:08:09.842 19:06:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:09.842 19:06:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:09.842 19:06:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.842 19:06:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:09.842 [2024-11-18 19:06:28.314900] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
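The -- traces above spell out the per-fuzzer launch pattern that ../common.sh drives for each fuzzer type: derive the TCP port by appending the zero-padded fuzzer number to 44, rewrite trsvcid in the JSON transport config to that port, create the per-type corpus directory, and start llvm_nvme_fuzz with -Z selecting which admin-command fuzzer to exercise (for type 15 that is fuzz_admin_get_features_command, per the NEW_FUNC line below). A condensed sketch of that sequence follows, with WS standing in for the long workspace path and the redirection of sed's output into the per-run config inferred from the -c argument that the trace passes afterwards:

    WS=/var/jenkins/workspace/short-fuzz-phy-autotest
    fuzzer_type=15
    port=$(printf '44%02d' "$fuzzer_type")                 # -> 4415, matching the trace
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    corpus_dir=$WS/spdk/../corpus/llvm_nvmf_${fuzzer_type}
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # point the transport config at this fuzzer's port instead of the default 4420
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$WS/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    # flags below are taken verbatim from the traced command line
    "$WS/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$WS/spdk/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"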
00:08:09.842 [2024-11-18 19:06:28.314967] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1305931 ] 00:08:09.842 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.101 [2024-11-18 19:06:28.496908] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.101 [2024-11-18 19:06:28.561125] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.101 [2024-11-18 19:06:28.561241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.101 [2024-11-18 19:06:28.619195] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.101 [2024-11-18 19:06:28.635520] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:10.101 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.101 INFO: Seed: 2972144639 00:08:10.101 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:10.101 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:10.101 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.101 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.101 #2 INITED exec/s: 0 rss: 60Mb 00:08:10.101 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.101 This may also happen if the target rejected all inputs we tried so far 00:08:10.101 [2024-11-18 19:06:28.680847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.101 [2024-11-18 19:06:28.680877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.101 [2024-11-18 19:06:28.680934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.101 [2024-11-18 19:06:28.680948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.619 NEW_FUNC[1/670]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:10.619 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.619 #5 NEW cov: 11558 ft: 11559 corp: 2/17b lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:08:10.619 [2024-11-18 19:06:29.001650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.001681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 [2024-11-18 19:06:29.001740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.001755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.619 #6 NEW cov: 11673 ft: 12103 corp: 3/33b 
lim: 35 exec/s: 0 rss: 68Mb L: 16/16 MS: 1 ChangeBinInt- 00:08:10.619 [2024-11-18 19:06:29.041963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.041990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 [2024-11-18 19:06:29.042053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.042068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.619 [2024-11-18 19:06:29.042126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.042144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.619 [2024-11-18 19:06:29.042205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.042220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.619 #10 NEW cov: 11679 ft: 12768 corp: 4/67b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 4 ChangeBinInt-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:10.619 [2024-11-18 19:06:29.081779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.619 [2024-11-18 19:06:29.081804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.619 [2024-11-18 19:06:29.081866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.081881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.620 #16 NEW cov: 11764 ft: 13000 corp: 5/83b lim: 35 exec/s: 0 rss: 68Mb L: 16/34 MS: 1 CrossOver- 00:08:10.620 [2024-11-18 19:06:29.122145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.122170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.122229] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.122244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.122304] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.122318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.122377] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.122390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.620 #17 NEW cov: 11764 ft: 13274 corp: 6/117b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:10.620 [2024-11-18 19:06:29.162410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.162435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.162496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.162511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.162570] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.162584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.162645] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.162659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.162719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.162734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.620 #18 NEW cov: 11764 ft: 13400 corp: 7/152b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertByte- 00:08:10.620 [2024-11-18 19:06:29.202141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.202166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.620 [2024-11-18 19:06:29.202228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.620 [2024-11-18 19:06:29.202243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 #19 NEW cov: 11764 ft: 13482 corp: 8/168b lim: 35 exec/s: 0 rss: 68Mb L: 16/35 MS: 1 ChangeBinInt- 00:08:10.879 [2024-11-18 19:06:29.242398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.242424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.242485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 
19:06:29.242500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.242563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.242578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 #20 NEW cov: 11764 ft: 13767 corp: 9/194b lim: 35 exec/s: 0 rss: 68Mb L: 26/35 MS: 1 CrossOver- 00:08:10.879 [2024-11-18 19:06:29.282761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.282786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.282849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.282864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.282923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.282937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.282999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.283013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.283073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.283087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.879 #21 NEW cov: 11764 ft: 13837 corp: 10/229b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:08:10.879 [2024-11-18 19:06:29.322520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.322559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.322624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.322637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 #22 NEW cov: 11764 ft: 13882 corp: 11/248b lim: 35 exec/s: 0 rss: 68Mb L: 19/35 MS: 1 EraseBytes- 00:08:10.879 [2024-11-18 19:06:29.362643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.362668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.362729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.362743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 #23 NEW cov: 11764 ft: 13917 corp: 12/265b lim: 35 exec/s: 0 rss: 68Mb L: 17/35 MS: 1 InsertByte- 00:08:10.879 [2024-11-18 19:06:29.403151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.403177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.403237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.403252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.403313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.403327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.403386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.403400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.403459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.403472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.879 #24 NEW cov: 11764 ft: 13950 corp: 13/300b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:10.879 [2024-11-18 19:06:29.443254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.443280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.443342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.443356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.443415] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.443429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.443491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.443508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.879 [2024-11-18 19:06:29.443572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.879 [2024-11-18 19:06:29.443586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.879 #25 NEW cov: 11764 ft: 13991 corp: 14/335b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:08:11.138 [2024-11-18 19:06:29.483472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.138 [2024-11-18 19:06:29.483498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.483561] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.483576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.483635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.483649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.483711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.483725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.483787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.483801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.139 #26 NEW cov: 11764 ft: 14009 corp: 15/370b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:11.139 [2024-11-18 19:06:29.523107] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.523132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.523193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.523208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 #27 NEW cov: 11764 ft: 14038 corp: 16/386b lim: 35 exec/s: 0 rss: 69Mb L: 16/35 MS: 1 ChangeBit- 00:08:11.139 [2024-11-18 19:06:29.563347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.563372] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.563449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.563463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.563523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.563537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.139 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.139 #28 NEW cov: 11787 ft: 14079 corp: 17/412b lim: 35 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 CrossOver- 00:08:11.139 [2024-11-18 19:06:29.603636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.603662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.603725] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.603740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.603798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000057e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.603812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.603870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.603885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.139 #29 NEW cov: 11787 ft: 14101 corp: 18/446b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 CopyPart- 00:08:11.139 [2024-11-18 19:06:29.643471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.643496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.643562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.643577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 #30 NEW cov: 11787 ft: 14151 corp: 19/462b lim: 35 exec/s: 0 rss: 69Mb L: 16/35 MS: 1 ShuffleBytes- 00:08:11.139 [2024-11-18 19:06:29.683848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 
19:06:29.683873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.683933] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.683947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.684006] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000057e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.684021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.684079] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.684094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.139 #31 NEW cov: 11787 ft: 14163 corp: 20/496b lim: 35 exec/s: 31 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:11.139 [2024-11-18 19:06:29.723721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.723745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.139 [2024-11-18 19:06:29.723806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.139 [2024-11-18 19:06:29.723823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.398 #32 NEW cov: 11787 ft: 14192 corp: 21/512b lim: 35 exec/s: 32 rss: 69Mb L: 16/35 MS: 1 ChangeBinInt- 00:08:11.398 [2024-11-18 19:06:29.763841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.398 [2024-11-18 19:06:29.763867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.398 [2024-11-18 19:06:29.763925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.763940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 #33 NEW cov: 11787 ft: 14210 corp: 22/529b lim: 35 exec/s: 33 rss: 69Mb L: 17/35 MS: 1 CrossOver- 00:08:11.399 [2024-11-18 19:06:29.804361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.804386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.804447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.804461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.804521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.804535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.804597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.804612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.804672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.804685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.399 #34 NEW cov: 11787 ft: 14226 corp: 23/564b lim: 35 exec/s: 34 rss: 69Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:11.399 [2024-11-18 19:06:29.844476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.844501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.844564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.844579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.844637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.844652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.844711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.844725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.844790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.844804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.399 #35 NEW cov: 11787 ft: 14298 corp: 24/599b lim: 35 exec/s: 35 rss: 69Mb L: 35/35 MS: 1 ChangeByte- 00:08:11.399 [2024-11-18 19:06:29.884622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.884647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.884708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.884723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.884784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.884798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.884857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.884872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.884932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.884946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.399 #36 NEW cov: 11787 ft: 14310 corp: 25/634b lim: 35 exec/s: 36 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:11.399 [2024-11-18 19:06:29.924318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.924343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.924404] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.924417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 #37 NEW cov: 11787 ft: 14319 corp: 26/652b lim: 35 exec/s: 37 rss: 70Mb L: 18/35 MS: 1 EraseBytes- 00:08:11.399 [2024-11-18 19:06:29.964840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.964865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.964926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.964941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.965000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.965014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.399 [2024-11-18 19:06:29.965073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.965087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:11.399 [2024-11-18 19:06:29.965152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.399 [2024-11-18 19:06:29.965166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.399 #38 NEW cov: 11787 ft: 14337 corp: 27/687b lim: 35 exec/s: 38 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:11.659 [2024-11-18 19:06:30.004965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.004992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.005055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.005069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.005131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.005146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.005205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.005220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.005279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.005294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.659 #39 NEW cov: 11787 ft: 14349 corp: 28/722b lim: 35 exec/s: 39 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:11.659 [2024-11-18 19:06:30.045113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.045139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.045201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.045217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.045275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.045290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.045351] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 
19:06:30.045365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.045428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.045442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.659 #40 NEW cov: 11787 ft: 14353 corp: 29/757b lim: 35 exec/s: 40 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:11.659 [2024-11-18 19:06:30.084989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.085019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.085080] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.085095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.085157] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.085172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.659 #41 NEW cov: 11787 ft: 14356 corp: 30/784b lim: 35 exec/s: 41 rss: 70Mb L: 27/35 MS: 1 InsertByte- 00:08:11.659 [2024-11-18 19:06:30.125172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.125198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.125258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.125273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.659 [2024-11-18 19:06:30.125335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000057e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.659 [2024-11-18 19:06:30.125350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.660 [2024-11-18 19:06:30.125410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.125424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.660 #42 NEW cov: 11787 ft: 14416 corp: 31/818b lim: 35 exec/s: 42 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:08:11.660 [2024-11-18 19:06:30.165003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.165029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.660 [2024-11-18 19:06:30.165091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.165106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.660 #43 NEW cov: 11787 ft: 14421 corp: 32/837b lim: 35 exec/s: 43 rss: 70Mb L: 19/35 MS: 1 ChangeBinInt- 00:08:11.660 [2024-11-18 19:06:30.205124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.205150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.660 [2024-11-18 19:06:30.205214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.205228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.660 #44 NEW cov: 11787 ft: 14424 corp: 33/857b lim: 35 exec/s: 44 rss: 70Mb L: 20/35 MS: 1 CrossOver- 00:08:11.660 [2024-11-18 19:06:30.245253] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.245279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.660 [2024-11-18 19:06:30.245342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.660 [2024-11-18 19:06:30.245357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.920 #45 NEW cov: 11787 ft: 14453 corp: 34/877b lim: 35 exec/s: 45 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:08:11.920 [2024-11-18 19:06:30.285762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.285788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.285847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.285862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.285923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.285937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.285998] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.286012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.286072] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.286086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.920 #46 NEW cov: 11787 ft: 14467 corp: 35/912b lim: 35 exec/s: 46 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:11.920 [2024-11-18 19:06:30.325887] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.325913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.325975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.325990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.326050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.326064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.326127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.326141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.326204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:0000073f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.326218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.920 #47 NEW cov: 11787 ft: 14484 corp: 36/947b lim: 35 exec/s: 47 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:11.920 [2024-11-18 19:06:30.365912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.365937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.366003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.366019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.366079] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.366093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.366153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.366167] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.920 #48 NEW cov: 11787 ft: 14505 corp: 37/978b lim: 35 exec/s: 48 rss: 70Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:08:11.920 [2024-11-18 19:06:30.405724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.405750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.405813] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.405827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.920 #49 NEW cov: 11787 ft: 14511 corp: 38/996b lim: 35 exec/s: 49 rss: 70Mb L: 18/35 MS: 1 ChangeBit- 00:08:11.920 [2024-11-18 19:06:30.446271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.446297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.920 [2024-11-18 19:06:30.446357] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.920 [2024-11-18 19:06:30.446370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.446431] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.446445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.446506] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.446519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.446582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.446597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.921 #50 NEW cov: 11787 ft: 14524 corp: 39/1031b lim: 35 exec/s: 50 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\000\000"- 00:08:11.921 [2024-11-18 19:06:30.486363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.486388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.486451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.486469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.486531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.486546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.486615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.486629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.921 [2024-11-18 19:06:30.486692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.921 [2024-11-18 19:06:30.486706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:11.921 #51 NEW cov: 11787 ft: 14533 corp: 40/1066b lim: 35 exec/s: 51 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:12.181 [2024-11-18 19:06:30.526046] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.526072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.526135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.526150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.181 #52 NEW cov: 11787 ft: 14552 corp: 41/1086b lim: 35 exec/s: 52 rss: 70Mb L: 20/35 MS: 1 CrossOver- 00:08:12.181 [2024-11-18 19:06:30.566569] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.566595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.566658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.566673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.566735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.566749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.566810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.566825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.566882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.566896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.181 #53 NEW cov: 11787 ft: 14578 corp: 42/1121b lim: 35 exec/s: 53 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:12.181 [2024-11-18 19:06:30.606286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005b1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.606312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.606374] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000589 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.606392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.181 #54 NEW cov: 11787 ft: 14582 corp: 43/1138b lim: 35 exec/s: 54 rss: 70Mb L: 17/35 MS: 1 ChangeByte- 00:08:12.181 [2024-11-18 19:06:30.646830] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.646857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.646920] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.646935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.646994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.647008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.647070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.647084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.647144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.647158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.181 #55 NEW cov: 11787 ft: 14592 corp: 44/1173b lim: 35 exec/s: 55 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:12.181 [2024-11-18 19:06:30.686576] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000075f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.686602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.181 [2024-11-18 19:06:30.686663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.181 [2024-11-18 19:06:30.686677] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:12.181 #56 NEW cov: 11787 ft: 14617 corp: 45/1192b lim: 35 exec/s: 28 rss: 70Mb L: 19/35 MS: 1 PersAutoDict- DE: "\000\000"-
00:08:12.181 #56 DONE cov: 11787 ft: 14617 corp: 45/1192b lim: 35 exec/s: 28 rss: 70Mb
00:08:12.181 ###### Recommended dictionary. ######
00:08:12.181 "\000\000" # Uses: 1
00:08:12.181 ###### End of recommended dictionary. ######
00:08:12.181 Done 56 runs in 2 second(s)
00:08:12.441 19:06:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
19:06:30 -- ../common.sh@72 -- # (( i++ ))
19:06:30 -- ../common.sh@72 -- # (( i < fuzz_num ))
19:06:30 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
19:06:30 -- nvmf/run.sh@23 -- # local fuzzer_type=16
19:06:30 -- nvmf/run.sh@24 -- # local timen=1
19:06:30 -- nvmf/run.sh@25 -- # local core=0x1
19:06:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
19:06:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
19:06:30 -- nvmf/run.sh@29 -- # printf %02d 16
19:06:30 -- nvmf/run.sh@29 -- # port=4416
19:06:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
19:06:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
19:06:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
19:06:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:12.441 [2024-11-18 19:06:30.875495] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:12.441 [2024-11-18 19:06:30.875568] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1306467 ]
00:08:12.441 EAL: No free 2048 kB hugepages reported on node 1
00:08:12.701 [2024-11-18 19:06:31.049859] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:12.701 [2024-11-18 19:06:31.113191] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:12.701 [2024-11-18 19:06:31.113306] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:12.701 [2024-11-18 19:06:31.171209] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:12.701 [2024-11-18 19:06:31.187541] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:12.701 INFO: Running with entropic power schedule (0xFF, 100).
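The shell trace above is the complete recipe nvmf/run.sh follows to stand up fuzzer 16: bump the loop counter in common.sh, derive the listener port from the fuzzer index (printf %02d 16 yields the "16" in 4416), rewrite the template JSON config so the TCP listener binds that port instead of the default 4420, create a per-fuzzer corpus directory, and launch llvm_nvme_fuzz against the target it just configured. Below is a minimal standalone sketch of those steps, using only the paths and flags visible in the trace; the redirection of sed's output into /tmp/fuzz_json_16.conf is an assumption, since bash xtrace does not echo redirections. The libFuzzer banner that follows resumes the captured output.

    #!/usr/bin/env bash
    # Sketch of the fuzzer-16 setup traced above; paths and flags are taken
    # from the log itself, not from reading run.sh.
    SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    FUZZER=16
    PORT=44$(printf %02d "$FUZZER")   # -> 4416, matching run.sh@29
    CORPUS="$SPDK/../corpus/llvm_nvmf_$FUZZER"
    CFG="/tmp/fuzz_json_$FUZZER.conf"
    TRID="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT"

    mkdir -p "$CORPUS"
    # Rebind the template config from the default port 4420 to this fuzzer's
    # port (writing the result to $CFG is assumed; the trace hides the redirect).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$CFG"
    # -m 0x1: core mask; -s 512: hugepage memory in MB; -t 1: the timen value
    # from the script; -D: per-fuzzer corpus dir; -Z 16: fuzzer index, mirroring
    # start_llvm_fuzz 16; -r: per-fuzzer RPC socket.
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m 0x1 -s 512 -P "$SPDK/../output/llvm/" -F "$TRID" \
        -c "$CFG" -t 1 -D "$CORPUS" -Z 16 -r "/var/tmp/spdk$FUZZER.sock"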
00:08:12.701 INFO: Seed: 1230161642 00:08:12.701 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:12.701 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:12.701 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:12.701 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.701 #2 INITED exec/s: 0 rss: 60Mb 00:08:12.701 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.701 This may also happen if the target rejected all inputs we tried so far 00:08:12.701 [2024-11-18 19:06:31.232150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.701 [2024-11-18 19:06:31.232185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.701 [2024-11-18 19:06:31.232246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.701 [2024-11-18 19:06:31.232271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.961 NEW_FUNC[1/667]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:12.961 NEW_FUNC[2/667]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.961 #10 NEW cov: 11633 ft: 11664 corp: 2/43b lim: 105 exec/s: 0 rss: 68Mb L: 42/42 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:12.961 [2024-11-18 19:06:31.552901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.961 [2024-11-18 19:06:31.552938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.961 [2024-11-18 19:06:31.552990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.961 [2024-11-18 19:06:31.553008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.220 NEW_FUNC[1/4]: 0x1c582d8 in spdk_thread_is_exited /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:728 00:08:13.220 NEW_FUNC[2/4]: 0x1c59068 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:797 00:08:13.220 #11 NEW cov: 11776 ft: 12064 corp: 3/86b lim: 105 exec/s: 0 rss: 69Mb L: 43/43 MS: 1 InsertByte- 00:08:13.220 [2024-11-18 19:06:31.622966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.622999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.220 [2024-11-18 19:06:31.623049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.623070] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.220 #12 NEW cov: 11782 ft: 12448 corp: 4/131b lim: 105 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:13.220 [2024-11-18 19:06:31.673068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17940362859850037496 len:63737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.673097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.220 [2024-11-18 19:06:31.673129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17940362863843014904 len:63737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.673145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.220 #13 NEW cov: 11867 ft: 12689 corp: 5/185b lim: 105 exec/s: 0 rss: 69Mb L: 54/54 MS: 1 InsertRepeatedBytes- 00:08:13.220 [2024-11-18 19:06:31.723185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.723214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.220 [2024-11-18 19:06:31.723262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.723286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.220 #14 NEW cov: 11867 ft: 12888 corp: 6/230b lim: 105 exec/s: 0 rss: 69Mb L: 45/54 MS: 1 CopyPart- 00:08:13.220 [2024-11-18 19:06:31.793409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.793440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.220 [2024-11-18 19:06:31.793491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8574853690513424384 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.220 [2024-11-18 19:06:31.793511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.479 #15 NEW cov: 11867 ft: 13000 corp: 7/273b lim: 105 exec/s: 0 rss: 69Mb L: 43/54 MS: 1 ChangeByte- 00:08:13.479 [2024-11-18 19:06:31.863571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.863599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.479 [2024-11-18 19:06:31.863647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.863672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:13.479 #16 NEW cov: 11867 ft: 13071 corp: 8/318b lim: 105 exec/s: 0 rss: 69Mb L: 45/54 MS: 1 ShuffleBytes- 00:08:13.479 [2024-11-18 19:06:31.913698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.913731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.479 [2024-11-18 19:06:31.913780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.913802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.479 #17 NEW cov: 11867 ft: 13092 corp: 9/361b lim: 105 exec/s: 0 rss: 69Mb L: 43/54 MS: 1 InsertByte- 00:08:13.479 [2024-11-18 19:06:31.963832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.963861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.479 [2024-11-18 19:06:31.963909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:31.963932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.479 #18 NEW cov: 11867 ft: 13120 corp: 10/405b lim: 105 exec/s: 0 rss: 69Mb L: 44/54 MS: 1 InsertByte- 00:08:13.479 [2024-11-18 19:06:32.014009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:32.014037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.479 [2024-11-18 19:06:32.014085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:4294901760 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.479 [2024-11-18 19:06:32.014109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.479 #19 NEW cov: 11867 ft: 13216 corp: 11/450b lim: 105 exec/s: 0 rss: 69Mb L: 45/54 MS: 1 CrossOver- 00:08:13.480 [2024-11-18 19:06:32.064096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.480 [2024-11-18 19:06:32.064126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.480 [2024-11-18 19:06:32.064176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.480 [2024-11-18 19:06:32.064199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.739 #20 NEW cov: 11867 ft: 13333 corp: 12/493b lim: 105 exec/s: 0 rss: 69Mb L: 43/54 MS: 1 ChangeBinInt- 00:08:13.739 [2024-11-18 19:06:32.114257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:0 nsid:0 lba:1241513984 len:9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.114287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.739 [2024-11-18 19:06:32.114320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.114338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.739 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.739 #21 NEW cov: 11884 ft: 13383 corp: 13/536b lim: 105 exec/s: 0 rss: 69Mb L: 43/54 MS: 1 ChangeBinInt- 00:08:13.739 [2024-11-18 19:06:32.164357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.164386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.739 [2024-11-18 19:06:32.164439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.164457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.739 #22 NEW cov: 11884 ft: 13411 corp: 14/579b lim: 105 exec/s: 0 rss: 69Mb L: 43/54 MS: 1 InsertByte- 00:08:13.739 [2024-11-18 19:06:32.214520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.214557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.739 [2024-11-18 19:06:32.214592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8574853690513424384 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.214617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.739 #23 NEW cov: 11884 ft: 13554 corp: 15/622b lim: 105 exec/s: 23 rss: 69Mb L: 43/54 MS: 1 ChangeBinInt- 00:08:13.739 [2024-11-18 19:06:32.274702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.274731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.739 [2024-11-18 19:06:32.274779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.274803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.739 #24 NEW cov: 11884 ft: 13601 corp: 16/666b lim: 105 exec/s: 24 rss: 69Mb L: 44/54 MS: 1 InsertByte- 00:08:13.739 [2024-11-18 19:06:32.334885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:13.739 [2024-11-18 19:06:32.334915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.739 [2024-11-18 19:06:32.334949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.739 [2024-11-18 19:06:32.334966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.999 #25 NEW cov: 11884 ft: 13633 corp: 17/717b lim: 105 exec/s: 25 rss: 69Mb L: 51/54 MS: 1 CopyPart- 00:08:13.999 [2024-11-18 19:06:32.395011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17940362859850037496 len:63737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.395040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.395089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18084478051919265791 len:63737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.395111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.999 #26 NEW cov: 11884 ft: 13675 corp: 18/775b lim: 105 exec/s: 26 rss: 70Mb L: 58/58 MS: 1 CMP- DE: "\376\377\377\372"- 00:08:13.999 [2024-11-18 19:06:32.455214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.455242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.455290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:6510615555426900570 len:23131 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.455313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.455343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:130841883705344 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.455359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.999 #27 NEW cov: 11884 ft: 13999 corp: 19/841b lim: 105 exec/s: 27 rss: 70Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:08:13.999 [2024-11-18 19:06:32.505311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562641642095491 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.505339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.505387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.505410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 
19:06:32.505440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.505456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.999 #31 NEW cov: 11884 ft: 14003 corp: 20/906b lim: 105 exec/s: 31 rss: 70Mb L: 65/66 MS: 4 CopyPart-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:13.999 [2024-11-18 19:06:32.555435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.555464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.555511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.555534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.999 [2024-11-18 19:06:32.555573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.999 [2024-11-18 19:06:32.555590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.259 #32 NEW cov: 11884 ft: 14009 corp: 21/975b lim: 105 exec/s: 32 rss: 70Mb L: 69/69 MS: 1 CrossOver- 00:08:14.259 [2024-11-18 19:06:32.625623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.625652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.625700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1099511627776 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.625722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.259 #33 NEW cov: 11884 ft: 14023 corp: 22/1017b lim: 105 exec/s: 33 rss: 70Mb L: 42/69 MS: 1 ChangeBit- 00:08:14.259 [2024-11-18 19:06:32.675678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.675707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.259 #34 NEW cov: 11884 ft: 14498 corp: 23/1057b lim: 105 exec/s: 34 rss: 70Mb L: 40/69 MS: 1 EraseBytes- 00:08:14.259 [2024-11-18 19:06:32.746020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562641642095491 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.746049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.746096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 
len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.746121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.746151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562598838371203 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.746168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.746196] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:8753160913407277433 len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.746212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.259 #35 NEW cov: 11884 ft: 15010 corp: 24/1148b lim: 105 exec/s: 35 rss: 70Mb L: 91/91 MS: 1 InsertRepeatedBytes- 00:08:14.259 [2024-11-18 19:06:32.816717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.816746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.816792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.816812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.259 #36 NEW cov: 11884 ft: 15072 corp: 25/1197b lim: 105 exec/s: 36 rss: 70Mb L: 49/91 MS: 1 PersAutoDict- DE: "\376\377\377\372"- 00:08:14.259 [2024-11-18 19:06:32.856950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2810246168720703488 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.856978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.857017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.857032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.259 [2024-11-18 19:06:32.857085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.259 [2024-11-18 19:06:32.857100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.519 #37 NEW cov: 11884 ft: 15081 corp: 26/1267b lim: 105 exec/s: 37 rss: 70Mb L: 70/91 MS: 1 InsertByte- 00:08:14.519 [2024-11-18 19:06:32.897175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.897202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:32.897246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.897270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:32.897324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:13961653357748797889 len:49602 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.897339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:32.897393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:13961653357748797889 len:49602 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.897407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.519 #38 NEW cov: 11884 ft: 15161 corp: 27/1363b lim: 105 exec/s: 38 rss: 70Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:14.519 [2024-11-18 19:06:32.937084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.937110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:32.937165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1099511627776 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.937183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.519 #39 NEW cov: 11884 ft: 15195 corp: 28/1405b lim: 105 exec/s: 39 rss: 70Mb L: 42/96 MS: 1 ChangeBinInt- 00:08:14.519 [2024-11-18 19:06:32.977167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.977192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:32.977230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:32.977246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.519 #40 NEW cov: 11884 ft: 15213 corp: 29/1455b lim: 105 exec/s: 40 rss: 70Mb L: 50/96 MS: 1 CopyPart- 00:08:14.519 [2024-11-18 19:06:33.017268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.017295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:33.017338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.017354] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.519 #41 NEW cov: 11884 ft: 15228 corp: 30/1504b lim: 105 exec/s: 41 rss: 70Mb L: 49/96 MS: 1 PersAutoDict- DE: "\376\377\377\372"- 00:08:14.519 [2024-11-18 19:06:33.057465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.057492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:33.057531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.057546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:33.057608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.057627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.519 #42 NEW cov: 11884 ft: 15242 corp: 31/1579b lim: 105 exec/s: 42 rss: 70Mb L: 75/96 MS: 1 CopyPart- 00:08:14.519 [2024-11-18 19:06:33.097517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.097544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.519 [2024-11-18 19:06:33.097587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.519 [2024-11-18 19:06:33.097611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.779 #43 NEW cov: 11891 ft: 15260 corp: 32/1622b lim: 105 exec/s: 43 rss: 70Mb L: 43/96 MS: 1 CopyPart- 00:08:14.779 [2024-11-18 19:06:33.137883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9476562641642095491 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.137910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.779 [2024-11-18 19:06:33.137956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:9476562641788044163 len:33668 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.137978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.779 [2024-11-18 19:06:33.138030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9476562598838371203 len:31090 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.138046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.779 [2024-11-18 19:06:33.138099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:8753160913407277433 
len:31098 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.138114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.779 #44 NEW cov: 11891 ft: 15265 corp: 33/1722b lim: 105 exec/s: 44 rss: 70Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:14.779 [2024-11-18 19:06:33.177777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1241513984 len:9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.177803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.779 [2024-11-18 19:06:33.177853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.177871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.779 #45 NEW cov: 11891 ft: 15328 corp: 34/1769b lim: 105 exec/s: 45 rss: 70Mb L: 47/100 MS: 1 CrossOver- 00:08:14.779 [2024-11-18 19:06:33.217760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.779 [2024-11-18 19:06:33.217787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.779 #46 NEW cov: 11891 ft: 15373 corp: 35/1809b lim: 105 exec/s: 23 rss: 70Mb L: 40/100 MS: 1 ChangeBinInt- 00:08:14.779 #46 DONE cov: 11891 ft: 15373 corp: 35/1809b lim: 105 exec/s: 23 rss: 70Mb 00:08:14.779 ###### Recommended dictionary. ###### 00:08:14.779 "\376\377\377\372" # Uses: 2 00:08:14.779 ###### End of recommended dictionary. 
###### 00:08:14.779 Done 46 runs in 2 second(s) 00:08:14.779 19:06:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:14.779 19:06:33 -- ../common.sh@72 -- # (( i++ )) 00:08:14.779 19:06:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.779 19:06:33 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:14.779 19:06:33 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:14.779 19:06:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.779 19:06:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.779 19:06:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:14.779 19:06:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:14.779 19:06:33 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:14.779 19:06:33 -- nvmf/run.sh@29 -- # port=4417 00:08:14.779 19:06:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:14.779 19:06:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:14.779 19:06:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.039 19:06:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:15.039 [2024-11-18 19:06:33.409682] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:15.039 [2024-11-18 19:06:33.409745] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1306799 ] 00:08:15.039 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.039 [2024-11-18 19:06:33.585389] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.299 [2024-11-18 19:06:33.648858] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.299 [2024-11-18 19:06:33.648994] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.299 [2024-11-18 19:06:33.707604] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.299 [2024-11-18 19:06:33.723935] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:15.299 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.299 INFO: Seed: 3767162150 00:08:15.299 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:15.299 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:15.299 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.299 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.299 #2 INITED exec/s: 0 rss: 60Mb 00:08:15.299 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:15.299 This may also happen if the target rejected all inputs we tried so far 00:08:15.299 [2024-11-18 19:06:33.800375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-18 19:06:33.800421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.299 [2024-11-18 19:06:33.800542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-18 19:06:33.800573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.299 [2024-11-18 19:06:33.800683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-18 19:06:33.800709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.299 [2024-11-18 19:06:33.800825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.299 [2024-11-18 19:06:33.800852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.558 NEW_FUNC[1/672]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:15.558 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.558 #18 NEW cov: 11679 ft: 11685 corp: 2/107b lim: 120 exec/s: 0 rss: 68Mb L: 106/106 MS: 1 InsertRepeatedBytes- 00:08:15.558 [2024-11-18 19:06:34.121290] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-18 19:06:34.121347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.558 [2024-11-18 19:06:34.121459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-18 19:06:34.121485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.558 [2024-11-18 19:06:34.121615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-18 19:06:34.121647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.558 [2024-11-18 19:06:34.121781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:129 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.558 [2024-11-18 19:06:34.121810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.558 #19 NEW cov: 11797 ft: 12204 corp: 3/213b lim: 120 exec/s: 0 rss: 68Mb L: 106/106 MS: 1 ChangeBit- 00:08:15.819 [2024-11-18 19:06:34.171345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.819 [2024-11-18 19:06:34.171380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.819 [2024-11-18 19:06:34.171504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.819 [2024-11-18 19:06:34.171527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.819 [2024-11-18 19:06:34.171643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.819 [2024-11-18 19:06:34.171665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.171777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18383130728972943360 len:129 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.171799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #20 NEW cov: 11803 ft: 12381 corp: 4/319b lim: 120 exec/s: 0 rss: 68Mb L: 106/106 MS: 1 CMP- DE: "\377\036"- 00:08:15.820 [2024-11-18 19:06:34.211274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.211308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.211405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.211427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.211540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.211563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.211684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.211705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #21 NEW cov: 11888 ft: 12778 corp: 5/427b lim: 120 exec/s: 0 rss: 68Mb L: 108/108 MS: 1 PersAutoDict- DE: "\377\036"- 00:08:15.820 [2024-11-18 19:06:34.251427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:71809104410050560 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.251460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.251566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 
19:06:34.251587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.251702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.251736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.251854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18383130728972943360 len:129 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.251879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #22 NEW cov: 11888 ft: 12836 corp: 6/533b lim: 120 exec/s: 0 rss: 68Mb L: 106/108 MS: 1 PersAutoDict- DE: "\377\036"- 00:08:15.820 [2024-11-18 19:06:34.291570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.291599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.291685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.291707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.291822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.291845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.291964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.291985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #23 NEW cov: 11888 ft: 12936 corp: 7/645b lim: 120 exec/s: 0 rss: 68Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:15.820 [2024-11-18 19:06:34.331725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.331761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.331887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.331912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.332027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.332048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.332168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.332188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #24 NEW cov: 11888 ft: 13105 corp: 8/753b lim: 120 exec/s: 0 rss: 68Mb L: 108/112 MS: 1 PersAutoDict- DE: "\377\036"- 00:08:15.820 [2024-11-18 19:06:34.381832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.381862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.381986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5373952 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.382006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.382120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.382140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.820 [2024-11-18 19:06:34.382267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.820 [2024-11-18 19:06:34.382291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.820 #25 NEW cov: 11888 ft: 13187 corp: 9/863b lim: 120 exec/s: 0 rss: 68Mb L: 110/112 MS: 1 CMP- DE: "\001\000\000R"- 00:08:16.079 [2024-11-18 19:06:34.422028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.422060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.422148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.422168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.422280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.422299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.422415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.422437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.079 #26 NEW cov: 11888 ft: 13278 corp: 10/971b lim: 120 exec/s: 0 rss: 68Mb L: 
108/112 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:16.079 [2024-11-18 19:06:34.461645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.461672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.461793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.461814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.079 #27 NEW cov: 11888 ft: 13789 corp: 11/1036b lim: 120 exec/s: 0 rss: 68Mb L: 65/112 MS: 1 EraseBytes- 00:08:16.079 [2024-11-18 19:06:34.502028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.502054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.502166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.502188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.502307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.502328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.079 #28 NEW cov: 11888 ft: 14106 corp: 12/1122b lim: 120 exec/s: 0 rss: 68Mb L: 86/112 MS: 1 EraseBytes- 00:08:16.079 [2024-11-18 19:06:34.542366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.542400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.079 [2024-11-18 19:06:34.542472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.079 [2024-11-18 19:06:34.542496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.542608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.542629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.542748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8388608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.542768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.080 #29 NEW cov: 11888 ft: 14136 corp: 13/1218b lim: 120 exec/s: 0 rss: 68Mb L: 96/112 MS: 1 EraseBytes- 
00:08:16.080 [2024-11-18 19:06:34.582266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.582298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.582410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.582431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.582554] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.582577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.080 #30 NEW cov: 11888 ft: 14144 corp: 14/1303b lim: 120 exec/s: 0 rss: 68Mb L: 85/112 MS: 1 CrossOver- 00:08:16.080 [2024-11-18 19:06:34.622594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.622626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.622733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.622757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.622877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.622898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.623011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.623035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.080 #31 NEW cov: 11888 ft: 14169 corp: 15/1415b lim: 120 exec/s: 0 rss: 68Mb L: 112/112 MS: 1 InsertRepeatedBytes- 00:08:16.080 [2024-11-18 19:06:34.662436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.662470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.662609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.662631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.080 [2024-11-18 19:06:34.662749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:16.080 [2024-11-18 19:06:34.662772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.339 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.339 #32 NEW cov: 11911 ft: 14221 corp: 16/1500b lim: 120 exec/s: 0 rss: 69Mb L: 85/112 MS: 1 CopyPart- 00:08:16.339 [2024-11-18 19:06:34.712370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.712397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.712526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.712547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.339 #33 NEW cov: 11911 ft: 14249 corp: 17/1569b lim: 120 exec/s: 0 rss: 69Mb L: 69/112 MS: 1 EraseBytes- 00:08:16.339 [2024-11-18 19:06:34.752980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.753008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.753131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.753168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.753291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.753320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.753438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.753463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.339 #34 NEW cov: 11911 ft: 14279 corp: 18/1681b lim: 120 exec/s: 34 rss: 69Mb L: 112/112 MS: 1 ShuffleBytes- 00:08:16.339 [2024-11-18 19:06:34.803140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.803173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.803270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.803294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.803416] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.803440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.339 [2024-11-18 19:06:34.803566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.803590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.339 #35 NEW cov: 11911 ft: 14298 corp: 19/1795b lim: 120 exec/s: 35 rss: 69Mb L: 114/114 MS: 1 CrossOver- 00:08:16.339 [2024-11-18 19:06:34.853319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.339 [2024-11-18 19:06:34.853354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.853462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.853487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.853608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.853633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.853750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.853773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.340 #36 NEW cov: 11911 ft: 14343 corp: 20/1901b lim: 120 exec/s: 36 rss: 69Mb L: 106/114 MS: 1 ChangeBit- 00:08:16.340 [2024-11-18 19:06:34.893350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.893382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.893501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:884763262976 len:52943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.893524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.893651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.893674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.340 #37 NEW cov: 11911 ft: 14356 corp: 21/1996b lim: 120 exec/s: 37 rss: 69Mb L: 95/114 MS: 1 InsertRepeatedBytes- 00:08:16.340 [2024-11-18 19:06:34.933562] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:90159953477633 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.933590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.933760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.933787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.933876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.933898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.340 [2024-11-18 19:06:34.934017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.340 [2024-11-18 19:06:34.934035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.599 #38 NEW cov: 11911 ft: 14381 corp: 22/2106b lim: 120 exec/s: 38 rss: 69Mb L: 110/114 MS: 1 PersAutoDict- DE: "\001\000\000R"- 00:08:16.599 [2024-11-18 19:06:34.973440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:34.973476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:34.973602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:34.973629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:34.973756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:34.973778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.599 #39 NEW cov: 11911 ft: 14396 corp: 23/2191b lim: 120 exec/s: 39 rss: 70Mb L: 85/114 MS: 1 ShuffleBytes- 00:08:16.599 [2024-11-18 19:06:35.023884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.023920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.024044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.024069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.024186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:16.599 [2024-11-18 19:06:35.024210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.024327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.024355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.599 #40 NEW cov: 11911 ft: 14431 corp: 24/2303b lim: 120 exec/s: 40 rss: 70Mb L: 112/114 MS: 1 ChangeBit- 00:08:16.599 [2024-11-18 19:06:35.063223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.063251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.599 #41 NEW cov: 11911 ft: 15241 corp: 25/2333b lim: 120 exec/s: 41 rss: 70Mb L: 30/114 MS: 1 CrossOver- 00:08:16.599 [2024-11-18 19:06:35.114113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.114148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.114263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.114287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.114411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.114433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.114560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.114582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.599 #42 NEW cov: 11911 ft: 15255 corp: 26/2451b lim: 120 exec/s: 42 rss: 70Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:16.599 [2024-11-18 19:06:35.154192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.154226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.154344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.154368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.154484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.154507] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.599 [2024-11-18 19:06:35.154634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.599 [2024-11-18 19:06:35.154657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.599 #43 NEW cov: 11911 ft: 15265 corp: 27/2570b lim: 120 exec/s: 43 rss: 70Mb L: 119/119 MS: 1 InsertByte- 00:08:16.859 [2024-11-18 19:06:35.204370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.204405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.204531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.204555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.204675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.204701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.204816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:129 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.204837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.859 #44 NEW cov: 11911 ft: 15275 corp: 28/2676b lim: 120 exec/s: 44 rss: 70Mb L: 106/119 MS: 1 CopyPart- 00:08:16.859 [2024-11-18 19:06:35.244043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.244071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.244186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.244211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 #45 NEW cov: 11911 ft: 15283 corp: 29/2742b lim: 120 exec/s: 45 rss: 70Mb L: 66/119 MS: 1 EraseBytes- 00:08:16.859 [2024-11-18 19:06:35.294660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.294692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.294793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.294813] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.294933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.294955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.295073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8192 len:129 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.295091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.859 #46 NEW cov: 11911 ft: 15302 corp: 30/2848b lim: 120 exec/s: 46 rss: 70Mb L: 106/119 MS: 1 ChangeBit- 00:08:16.859 [2024-11-18 19:06:35.334776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.334808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.334890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.334913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.335026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:16385 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.335046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.335161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8388608 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.335186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.859 #47 NEW cov: 11911 ft: 15311 corp: 31/2944b lim: 120 exec/s: 47 rss: 70Mb L: 96/119 MS: 1 ChangeBit- 00:08:16.859 [2024-11-18 19:06:35.374421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.374453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.374564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.374587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 #48 NEW cov: 11911 ft: 15314 corp: 32/3009b lim: 120 exec/s: 48 rss: 70Mb L: 65/119 MS: 1 EraseBytes- 00:08:16.859 [2024-11-18 19:06:35.414792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.414826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.414943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:884763262976 len:52943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.414963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.415081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.415102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.859 #49 NEW cov: 11911 ft: 15324 corp: 33/3104b lim: 120 exec/s: 49 rss: 70Mb L: 95/119 MS: 1 ChangeBit- 00:08:16.859 [2024-11-18 19:06:35.455250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.455279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.455373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.455397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.455530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.455571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.859 [2024-11-18 19:06:35.455691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.859 [2024-11-18 19:06:35.455717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.118 #50 NEW cov: 11911 ft: 15331 corp: 34/3212b lim: 120 exec/s: 50 rss: 70Mb L: 108/119 MS: 1 ChangeBit- 00:08:17.118 [2024-11-18 19:06:35.494934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.494967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.495078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:884763262976 len:52943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.495103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.495219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:83 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.495243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.118 #51 NEW cov: 11911 ft: 15342 corp: 35/3307b lim: 120 exec/s: 51 
rss: 70Mb L: 95/119 MS: 1 PersAutoDict- DE: "\001\000\000R"- 00:08:17.118 [2024-11-18 19:06:35.535365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.535398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.535512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.535534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.535655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.535678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.535794] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:32768 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.535820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.118 #52 NEW cov: 11911 ft: 15369 corp: 36/3404b lim: 120 exec/s: 52 rss: 70Mb L: 97/119 MS: 1 EraseBytes- 00:08:17.118 [2024-11-18 19:06:35.575509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.575544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.575670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5373952 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.575691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.575804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.575827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.575942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.575962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.118 #53 NEW cov: 11911 ft: 15432 corp: 37/3514b lim: 120 exec/s: 53 rss: 70Mb L: 110/119 MS: 1 ChangeBinInt- 00:08:17.118 [2024-11-18 19:06:35.615117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.615144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.615263] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.615283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 #54 NEW cov: 11911 ft: 15441 corp: 38/3583b lim: 120 exec/s: 54 rss: 70Mb L: 69/119 MS: 1 ShuffleBytes- 00:08:17.118 [2024-11-18 19:06:35.655772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:65310 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.655805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.655913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.655931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.656047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.656069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.656183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.656204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.118 #55 NEW cov: 11911 ft: 15479 corp: 39/3693b lim: 120 exec/s: 55 rss: 70Mb L: 110/119 MS: 1 PersAutoDict- DE: "\377\036"- 00:08:17.118 [2024-11-18 19:06:35.695444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.695471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.695590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:884763262976 len:52943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.695607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.118 [2024-11-18 19:06:35.695727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:83 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.118 [2024-11-18 19:06:35.695752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.377 #56 NEW cov: 11911 ft: 15481 corp: 40/3788b lim: 120 exec/s: 56 rss: 70Mb L: 95/119 MS: 1 ChangeByte- 00:08:17.377 [2024-11-18 19:06:35.735966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7680 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.735998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.736093] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.736114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.736238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.736262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.736369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.736391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.377 #57 NEW cov: 11911 ft: 15483 corp: 41/3898b lim: 120 exec/s: 57 rss: 70Mb L: 110/119 MS: 1 CopyPart- 00:08:17.377 [2024-11-18 19:06:35.776116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.776150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.776260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.776283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.776403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:618475290624 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.776427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.377 [2024-11-18 19:06:35.776542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.377 [2024-11-18 19:06:35.776567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.377 #58 NEW cov: 11911 ft: 15501 corp: 42/4006b lim: 120 exec/s: 29 rss: 70Mb L: 108/119 MS: 1 ChangeByte- 00:08:17.377 #58 DONE cov: 11911 ft: 15501 corp: 42/4006b lim: 120 exec/s: 29 rss: 70Mb 00:08:17.377 ###### Recommended dictionary. ###### 00:08:17.377 "\377\036" # Uses: 4 00:08:17.377 "\001\000\000R" # Uses: 2 00:08:17.377 "\000\000\000\000" # Uses: 0 00:08:17.377 ###### End of recommended dictionary. 
###### 00:08:17.377 Done 58 runs in 2 second(s) 00:08:17.377 19:06:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:17.377 19:06:35 -- ../common.sh@72 -- # (( i++ )) 00:08:17.377 19:06:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.377 19:06:35 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:17.377 19:06:35 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:17.377 19:06:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.377 19:06:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.377 19:06:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.377 19:06:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:17.377 19:06:35 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:17.377 19:06:35 -- nvmf/run.sh@29 -- # port=4418 00:08:17.377 19:06:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.377 19:06:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:17.377 19:06:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.377 19:06:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:17.377 [2024-11-18 19:06:35.958942] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:17.377 [2024-11-18 19:06:35.959018] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1307300 ] 00:08:17.636 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.636 [2024-11-18 19:06:36.139421] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.636 [2024-11-18 19:06:36.206352] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.636 [2024-11-18 19:06:36.206469] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.894 [2024-11-18 19:06:36.264663] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.894 [2024-11-18 19:06:36.280976] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:17.894 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.894 INFO: Seed: 2028188464 00:08:17.894 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:17.894 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:17.894 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.894 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.894 #2 INITED exec/s: 0 rss: 60Mb 00:08:17.894 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:17.894 This may also happen if the target rejected all inputs we tried so far 00:08:17.894 [2024-11-18 19:06:36.326240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.894 [2024-11-18 19:06:36.326270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.894 [2024-11-18 19:06:36.326324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.894 [2024-11-18 19:06:36.326340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.894 [2024-11-18 19:06:36.326392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:17.894 [2024-11-18 19:06:36.326406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.152 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:18.152 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.152 #4 NEW cov: 11626 ft: 11619 corp: 2/78b lim: 100 exec/s: 0 rss: 68Mb L: 77/77 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:18.152 [2024-11-18 19:06:36.646909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.152 [2024-11-18 19:06:36.646941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.152 [2024-11-18 19:06:36.646996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.152 [2024-11-18 19:06:36.647010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.152 #20 NEW cov: 11741 ft: 12472 corp: 3/137b lim: 100 exec/s: 0 rss: 69Mb L: 59/77 MS: 1 CrossOver- 00:08:18.152 [2024-11-18 19:06:36.687083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.152 [2024-11-18 19:06:36.687111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.152 [2024-11-18 19:06:36.687158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.152 [2024-11-18 19:06:36.687173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.152 [2024-11-18 19:06:36.687223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.153 [2024-11-18 19:06:36.687237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.153 #26 NEW cov: 11747 ft: 12677 corp: 4/214b lim: 100 exec/s: 0 rss: 69Mb L: 77/77 MS: 1 ChangeByte- 00:08:18.153 [2024-11-18 19:06:36.727159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.153 [2024-11-18 19:06:36.727187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.153 [2024-11-18 19:06:36.727220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.153 [2024-11-18 19:06:36.727235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.153 [2024-11-18 19:06:36.727288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.153 [2024-11-18 19:06:36.727303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.153 #27 NEW cov: 11832 ft: 12970 corp: 5/291b lim: 100 exec/s: 0 rss: 69Mb L: 77/77 MS: 1 ShuffleBytes- 00:08:18.411 [2024-11-18 19:06:36.767280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 19:06:36.767306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.767347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.411 [2024-11-18 19:06:36.767361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.767410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.411 [2024-11-18 19:06:36.767424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.411 #42 NEW cov: 11832 ft: 13043 corp: 6/357b lim: 100 exec/s: 0 rss: 69Mb L: 66/77 MS: 5 InsertByte-ChangeByte-ChangeBit-ChangeBinInt-InsertRepeatedBytes- 00:08:18.411 [2024-11-18 19:06:36.807171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 19:06:36.807197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 #45 NEW cov: 11832 ft: 13483 corp: 7/395b lim: 100 exec/s: 0 rss: 69Mb L: 38/77 MS: 3 ChangeBinInt-ChangeBit-CrossOver- 00:08:18.411 [2024-11-18 19:06:36.847482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 19:06:36.847508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.847547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.411 [2024-11-18 19:06:36.847565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.847617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.411 [2024-11-18 19:06:36.847631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.411 #46 NEW cov: 11832 ft: 13547 corp: 8/461b lim: 100 exec/s: 0 rss: 69Mb L: 66/77 MS: 1 CopyPart- 00:08:18.411 [2024-11-18 19:06:36.887648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 
19:06:36.887674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.887712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.411 [2024-11-18 19:06:36.887725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.887779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.411 [2024-11-18 19:06:36.887793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.411 #47 NEW cov: 11832 ft: 13589 corp: 9/527b lim: 100 exec/s: 0 rss: 69Mb L: 66/77 MS: 1 ChangeBit- 00:08:18.411 [2024-11-18 19:06:36.927732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 19:06:36.927758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.927794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.411 [2024-11-18 19:06:36.927812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.927862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.411 [2024-11-18 19:06:36.927876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.411 #48 NEW cov: 11832 ft: 13693 corp: 10/605b lim: 100 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 InsertByte- 00:08:18.411 [2024-11-18 19:06:36.967853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.411 [2024-11-18 19:06:36.967879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.967914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.411 [2024-11-18 19:06:36.967927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.411 [2024-11-18 19:06:36.967979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.411 [2024-11-18 19:06:36.967993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.411 #49 NEW cov: 11832 ft: 13720 corp: 11/683b lim: 100 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 InsertByte- 00:08:18.411 [2024-11-18 19:06:37.007998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.412 [2024-11-18 19:06:37.008024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.412 [2024-11-18 19:06:37.008058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.412 [2024-11-18 19:06:37.008072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.412 [2024-11-18 19:06:37.008123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.412 [2024-11-18 19:06:37.008136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.671 #50 NEW cov: 11832 ft: 13867 corp: 12/761b lim: 100 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 InsertByte- 00:08:18.671 [2024-11-18 19:06:37.048084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.048110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.048147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.048162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.048215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.671 [2024-11-18 19:06:37.048230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.671 #51 NEW cov: 11832 ft: 13884 corp: 13/839b lim: 100 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 ChangeBinInt- 00:08:18.671 [2024-11-18 19:06:37.088222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.088248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.088286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.088298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.088353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.671 [2024-11-18 19:06:37.088368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.671 #52 NEW cov: 11832 ft: 13954 corp: 14/916b lim: 100 exec/s: 0 rss: 69Mb L: 77/78 MS: 1 ShuffleBytes- 00:08:18.671 [2024-11-18 19:06:37.128297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.128322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.128363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.128378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.128430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.671 [2024-11-18 19:06:37.128445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:18.671 #53 NEW cov: 11832 ft: 14005 corp: 15/993b lim: 100 exec/s: 0 rss: 69Mb L: 77/78 MS: 1 ChangeByte- 00:08:18.671 [2024-11-18 19:06:37.168431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.168457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.168494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.168508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.168563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.671 [2024-11-18 19:06:37.168578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.671 #54 NEW cov: 11832 ft: 14009 corp: 16/1070b lim: 100 exec/s: 0 rss: 69Mb L: 77/78 MS: 1 CopyPart- 00:08:18.671 [2024-11-18 19:06:37.198389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.198416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.198467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.198482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.671 #55 NEW cov: 11855 ft: 14079 corp: 17/1116b lim: 100 exec/s: 0 rss: 69Mb L: 46/78 MS: 1 CMP- DE: "\001\000\177\240\004\000\233\370"- 00:08:18.671 [2024-11-18 19:06:37.238789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.671 [2024-11-18 19:06:37.238816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.238870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.671 [2024-11-18 19:06:37.238886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.238937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.671 [2024-11-18 19:06:37.238953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.671 [2024-11-18 19:06:37.239004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.671 [2024-11-18 19:06:37.239022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.671 #57 NEW cov: 11855 ft: 14368 corp: 18/1202b lim: 100 exec/s: 0 rss: 69Mb L: 86/86 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:18.930 [2024-11-18 19:06:37.278844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.930 [2024-11-18 19:06:37.278872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.278911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.278926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.278977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.278993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.279044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.931 [2024-11-18 19:06:37.279058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.931 #58 NEW cov: 11855 ft: 14417 corp: 19/1288b lim: 100 exec/s: 0 rss: 70Mb L: 86/86 MS: 1 ChangeByte- 00:08:18.931 [2024-11-18 19:06:37.318987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.319012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.319049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.319065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.319117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.319132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.319184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.931 [2024-11-18 19:06:37.319199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.931 #59 NEW cov: 11855 ft: 14442 corp: 20/1385b lim: 100 exec/s: 59 rss: 70Mb L: 97/97 MS: 1 InsertRepeatedBytes- 00:08:18.931 [2024-11-18 19:06:37.358963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.358990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.359027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.359041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.359094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.359108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.931 #60 NEW cov: 11855 ft: 14458 corp: 21/1464b lim: 100 exec/s: 60 rss: 70Mb L: 79/97 MS: 1 InsertByte- 00:08:18.931 [2024-11-18 19:06:37.399116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.399142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.399184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.399199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.399251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.399266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.931 #61 NEW cov: 11855 ft: 14532 corp: 22/1542b lim: 100 exec/s: 61 rss: 70Mb L: 78/97 MS: 1 ChangeByte- 00:08:18.931 [2024-11-18 19:06:37.439111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.439138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.439193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.439209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 #62 NEW cov: 11855 ft: 14540 corp: 23/1601b lim: 100 exec/s: 62 rss: 70Mb L: 59/97 MS: 1 ChangeBit- 00:08:18.931 [2024-11-18 19:06:37.479473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.479499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.479547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.479568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.479620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.479636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.479689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.931 [2024-11-18 19:06:37.479704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.931 #63 NEW cov: 11855 ft: 14543 corp: 24/1690b lim: 100 exec/s: 63 rss: 70Mb L: 89/97 MS: 1 CopyPart- 00:08:18.931 [2024-11-18 19:06:37.519450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.931 [2024-11-18 19:06:37.519476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.519518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.931 [2024-11-18 19:06:37.519532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.931 [2024-11-18 19:06:37.519590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.931 [2024-11-18 19:06:37.519605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.190 #64 NEW cov: 11855 ft: 14548 corp: 25/1765b lim: 100 exec/s: 64 rss: 70Mb L: 75/97 MS: 1 CrossOver- 00:08:19.190 [2024-11-18 19:06:37.559595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.190 [2024-11-18 19:06:37.559621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.559657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.190 [2024-11-18 19:06:37.559672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.559732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.190 [2024-11-18 19:06:37.559748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.190 #65 NEW cov: 11855 ft: 14561 corp: 26/1842b lim: 100 exec/s: 65 rss: 70Mb L: 77/97 MS: 1 ChangeByte- 00:08:19.190 [2024-11-18 19:06:37.599830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.190 [2024-11-18 19:06:37.599858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.599896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.190 [2024-11-18 19:06:37.599911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.599962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.190 [2024-11-18 19:06:37.599977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.600030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.190 [2024-11-18 19:06:37.600046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.190 #66 NEW cov: 11855 ft: 14576 corp: 27/1928b lim: 100 exec/s: 66 rss: 70Mb L: 86/97 MS: 1 ShuffleBytes- 00:08:19.190 [2024-11-18 19:06:37.639802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.190 [2024-11-18 19:06:37.639828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.639868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.190 [2024-11-18 19:06:37.639882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.190 [2024-11-18 19:06:37.639936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.190 [2024-11-18 19:06:37.639950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.190 #67 NEW cov: 11855 ft: 14583 corp: 28/2006b lim: 100 exec/s: 67 rss: 70Mb L: 78/97 MS: 1 ShuffleBytes- 00:08:19.191 [2024-11-18 19:06:37.669874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.191 [2024-11-18 19:06:37.669900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.669944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.191 [2024-11-18 19:06:37.669959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.670012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.191 [2024-11-18 19:06:37.670026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 #68 NEW cov: 11855 ft: 14619 corp: 29/2084b lim: 100 exec/s: 68 rss: 70Mb L: 78/97 MS: 1 CrossOver- 00:08:19.191 [2024-11-18 19:06:37.710006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.191 [2024-11-18 19:06:37.710031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.710073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.191 [2024-11-18 19:06:37.710090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.710142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.191 [2024-11-18 19:06:37.710157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 #69 NEW cov: 11855 ft: 14681 corp: 30/2160b lim: 100 exec/s: 69 rss: 70Mb L: 76/97 MS: 1 InsertByte- 00:08:19.191 [2024-11-18 19:06:37.750282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.191 [2024-11-18 19:06:37.750308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.750356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.191 [2024-11-18 19:06:37.750370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.750410] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.191 [2024-11-18 19:06:37.750424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.750479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.191 [2024-11-18 19:06:37.750494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.191 #70 NEW cov: 11855 ft: 14695 corp: 31/2240b lim: 100 exec/s: 70 rss: 70Mb L: 80/97 MS: 1 CrossOver- 00:08:19.191 [2024-11-18 19:06:37.790279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.191 [2024-11-18 19:06:37.790305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.790343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.191 [2024-11-18 19:06:37.790358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.191 [2024-11-18 19:06:37.790412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.191 [2024-11-18 19:06:37.790426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 #71 NEW cov: 11855 ft: 14754 corp: 32/2318b lim: 100 exec/s: 71 rss: 70Mb L: 78/97 MS: 1 ChangeByte- 00:08:19.450 [2024-11-18 19:06:37.830406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.450 [2024-11-18 19:06:37.830433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.830470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.450 [2024-11-18 19:06:37.830484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.830539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.450 [2024-11-18 19:06:37.830559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 #72 NEW cov: 11855 ft: 14759 corp: 33/2396b lim: 100 exec/s: 72 rss: 70Mb L: 78/97 MS: 1 ChangeBinInt- 00:08:19.450 [2024-11-18 19:06:37.870557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.450 [2024-11-18 19:06:37.870583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.870632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.450 [2024-11-18 19:06:37.870650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.870699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 
cid:2 nsid:0 00:08:19.450 [2024-11-18 19:06:37.870712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.870763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.450 [2024-11-18 19:06:37.870778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.450 #73 NEW cov: 11855 ft: 14770 corp: 34/2494b lim: 100 exec/s: 73 rss: 70Mb L: 98/98 MS: 1 CrossOver- 00:08:19.450 [2024-11-18 19:06:37.910553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.450 [2024-11-18 19:06:37.910579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.450 [2024-11-18 19:06:37.910626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.450 [2024-11-18 19:06:37.910641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:37.910695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.451 [2024-11-18 19:06:37.910710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.451 #74 NEW cov: 11855 ft: 14800 corp: 35/2573b lim: 100 exec/s: 74 rss: 70Mb L: 79/98 MS: 1 InsertByte- 00:08:19.451 [2024-11-18 19:06:37.950651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.451 [2024-11-18 19:06:37.950678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:37.950714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.451 [2024-11-18 19:06:37.950728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:37.950780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.451 [2024-11-18 19:06:37.950795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.451 #75 NEW cov: 11855 ft: 14851 corp: 36/2651b lim: 100 exec/s: 75 rss: 70Mb L: 78/98 MS: 1 InsertByte- 00:08:19.451 [2024-11-18 19:06:37.990847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.451 [2024-11-18 19:06:37.990874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:37.990916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.451 [2024-11-18 19:06:37.990932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:37.990994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.451 [2024-11-18 19:06:37.991008] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.451 #76 NEW cov: 11855 ft: 14884 corp: 37/2729b lim: 100 exec/s: 76 rss: 70Mb L: 78/98 MS: 1 ShuffleBytes- 00:08:19.451 [2024-11-18 19:06:38.031046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.451 [2024-11-18 19:06:38.031072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:38.031112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.451 [2024-11-18 19:06:38.031127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:38.031180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.451 [2024-11-18 19:06:38.031193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.451 [2024-11-18 19:06:38.031245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.451 [2024-11-18 19:06:38.031260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.451 #77 NEW cov: 11855 ft: 14886 corp: 38/2821b lim: 100 exec/s: 77 rss: 70Mb L: 92/98 MS: 1 InsertRepeatedBytes- 00:08:19.710 [2024-11-18 19:06:38.071050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.071075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.071122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.071136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.071188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.071203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.710 #78 NEW cov: 11855 ft: 14902 corp: 39/2899b lim: 100 exec/s: 78 rss: 70Mb L: 78/98 MS: 1 ChangeBinInt- 00:08:19.710 [2024-11-18 19:06:38.111164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.111189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.111226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.111241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.111292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.111306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.710 #79 NEW cov: 11855 ft: 14922 corp: 40/2977b lim: 100 exec/s: 79 rss: 70Mb L: 78/98 MS: 1 ChangeBinInt- 00:08:19.710 [2024-11-18 19:06:38.151252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.151277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.151321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.151335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.151387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.151401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.710 #80 NEW cov: 11855 ft: 14930 corp: 41/3055b lim: 100 exec/s: 80 rss: 70Mb L: 78/98 MS: 1 ChangeASCIIInt- 00:08:19.710 [2024-11-18 19:06:38.191498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.191524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.191571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.191584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.191635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.191649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.191701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.710 [2024-11-18 19:06:38.191714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.710 #81 NEW cov: 11855 ft: 14955 corp: 42/3141b lim: 100 exec/s: 81 rss: 70Mb L: 86/98 MS: 1 ChangeBinInt- 00:08:19.710 [2024-11-18 19:06:38.231467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.231493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.231530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.231545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.231604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.231619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:19.710 #82 NEW cov: 11855 ft: 14960 corp: 43/3218b lim: 100 exec/s: 82 rss: 70Mb L: 77/98 MS: 1 EraseBytes- 00:08:19.710 [2024-11-18 19:06:38.271612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.710 [2024-11-18 19:06:38.271637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.271682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.710 [2024-11-18 19:06:38.271696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.710 [2024-11-18 19:06:38.271747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.710 [2024-11-18 19:06:38.271761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.710 #83 NEW cov: 11855 ft: 15000 corp: 44/3296b lim: 100 exec/s: 83 rss: 70Mb L: 78/98 MS: 1 ChangeByte- 00:08:19.969 [2024-11-18 19:06:38.311765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.969 [2024-11-18 19:06:38.311792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.969 [2024-11-18 19:06:38.311827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.969 [2024-11-18 19:06:38.311842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.969 [2024-11-18 19:06:38.311895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.969 [2024-11-18 19:06:38.311909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.969 #84 NEW cov: 11855 ft: 15012 corp: 45/3372b lim: 100 exec/s: 42 rss: 70Mb L: 76/98 MS: 1 ShuffleBytes- 00:08:19.969 #84 DONE cov: 11855 ft: 15012 corp: 45/3372b lim: 100 exec/s: 42 rss: 70Mb 00:08:19.969 ###### Recommended dictionary. ###### 00:08:19.969 "\001\000\177\240\004\000\233\370" # Uses: 0 00:08:19.969 ###### End of recommended dictionary. 
###### 00:08:19.969 Done 84 runs in 2 second(s) 00:08:19.969 19:06:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:19.969 19:06:38 -- ../common.sh@72 -- # (( i++ )) 00:08:19.969 19:06:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.969 19:06:38 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:19.969 19:06:38 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:19.969 19:06:38 -- nvmf/run.sh@24 -- # local timen=1 00:08:19.969 19:06:38 -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.969 19:06:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:19.969 19:06:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:19.969 19:06:38 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:19.969 19:06:38 -- nvmf/run.sh@29 -- # port=4419 00:08:19.970 19:06:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:19.970 19:06:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:19.970 19:06:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.970 19:06:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:19.970 [2024-11-18 19:06:38.498983] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:19.970 [2024-11-18 19:06:38.499058] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1307847 ] 00:08:19.970 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.228 [2024-11-18 19:06:38.678303] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.228 [2024-11-18 19:06:38.741451] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.228 [2024-11-18 19:06:38.741573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.228 [2024-11-18 19:06:38.799480] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.228 [2024-11-18 19:06:38.815805] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:20.228 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.228 INFO: Seed: 267228054 00:08:20.487 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:20.487 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:20.487 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:20.487 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.487 #2 INITED exec/s: 0 rss: 61Mb 00:08:20.487 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:20.487 This may also happen if the target rejected all inputs we tried so far 00:08:20.487 [2024-11-18 19:06:38.864840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:20.487 [2024-11-18 19:06:38.864873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.745 NEW_FUNC[1/670]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:20.745 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:20.745 #3 NEW cov: 11599 ft: 11600 corp: 2/19b lim: 50 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 InsertRepeatedBytes- 00:08:20.745 [2024-11-18 19:06:39.165583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:20.745 [2024-11-18 19:06:39.165619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.745 #9 NEW cov: 11719 ft: 12070 corp: 3/38b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CrossOver- 00:08:20.745 [2024-11-18 19:06:39.215648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10995284049920 len:1 00:08:20.745 [2024-11-18 19:06:39.215677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.745 #10 NEW cov: 11725 ft: 12316 corp: 4/57b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:08:20.745 [2024-11-18 19:06:39.255758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11076888428544 len:1 00:08:20.745 [2024-11-18 19:06:39.255787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.745 #11 NEW cov: 11810 ft: 12682 corp: 5/76b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ChangeBinInt- 00:08:20.745 [2024-11-18 19:06:39.295868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:20.745 [2024-11-18 19:06:39.295898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.745 #12 NEW cov: 11810 ft: 12784 corp: 6/95b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ShuffleBytes- 00:08:20.745 [2024-11-18 19:06:39.335973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:20.746 [2024-11-18 19:06:39.336001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 #17 NEW cov: 11810 ft: 12846 corp: 7/109b lim: 50 exec/s: 0 rss: 68Mb L: 14/19 MS: 5 CopyPart-ChangeBit-ChangeByte-ShuffleBytes-CrossOver- 00:08:21.004 [2024-11-18 19:06:39.376191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:21.004 [2024-11-18 19:06:39.376220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 [2024-11-18 19:06:39.376286] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.004 [2024-11-18 19:06:39.376308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.004 #18 NEW cov: 11810 ft: 13229 corp: 8/135b lim: 50 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:21.004 [2024-11-18 19:06:39.416324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:21.004 [2024-11-18 19:06:39.416353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 [2024-11-18 19:06:39.416415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2533274790395914 len:1 00:08:21.004 [2024-11-18 19:06:39.416437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.004 #22 NEW cov: 11810 ft: 13262 corp: 9/156b lim: 50 exec/s: 0 rss: 69Mb L: 21/26 MS: 4 EraseBytes-ChangeBinInt-ShuffleBytes-CrossOver- 00:08:21.004 [2024-11-18 19:06:39.456454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:21.004 [2024-11-18 19:06:39.456483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 [2024-11-18 19:06:39.456541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.004 [2024-11-18 19:06:39.456571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.004 #23 NEW cov: 11810 ft: 13296 corp: 10/182b lim: 50 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 CopyPart- 00:08:21.004 [2024-11-18 19:06:39.496579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:181093269504 len:1 00:08:21.004 [2024-11-18 19:06:39.496609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 [2024-11-18 19:06:39.496664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.004 [2024-11-18 19:06:39.496686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.004 #24 NEW cov: 11810 ft: 13317 corp: 11/210b lim: 50 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 CopyPart- 00:08:21.004 [2024-11-18 19:06:39.536594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11076888428544 len:1 00:08:21.004 [2024-11-18 19:06:39.536624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.004 #25 NEW cov: 11810 ft: 13383 corp: 12/229b lim: 50 exec/s: 0 rss: 69Mb L: 19/28 MS: 1 ChangeByte- 00:08:21.004 [2024-11-18 19:06:39.576713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643082 len:1 00:08:21.004 [2024-11-18 19:06:39.576743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:21.004 #26 NEW cov: 11810 ft: 13410 corp: 13/243b lim: 50 exec/s: 0 rss: 69Mb L: 14/28 MS: 1 CrossOver- 00:08:21.263 [2024-11-18 19:06:39.616917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:181093269504 len:1 00:08:21.263 [2024-11-18 19:06:39.616946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.263 [2024-11-18 19:06:39.617016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.263 [2024-11-18 19:06:39.617037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.263 #27 NEW cov: 11810 ft: 13439 corp: 14/271b lim: 50 exec/s: 0 rss: 69Mb L: 28/28 MS: 1 ChangeByte- 00:08:21.263 [2024-11-18 19:06:39.656950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35975009466493697 len:4614 00:08:21.263 [2024-11-18 19:06:39.656979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.263 #32 NEW cov: 11810 ft: 13479 corp: 15/282b lim: 50 exec/s: 0 rss: 69Mb L: 11/28 MS: 5 CopyPart-CopyPart-InsertByte-ChangeByte-CMP- DE: "\001\000\177\317\024\022\005Q"- 00:08:21.263 [2024-11-18 19:06:39.687179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:21.263 [2024-11-18 19:06:39.687209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.263 [2024-11-18 19:06:39.687278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10 len:11 00:08:21.263 [2024-11-18 19:06:39.687301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.263 #33 NEW cov: 11810 ft: 13498 corp: 16/302b lim: 50 exec/s: 0 rss: 69Mb L: 20/28 MS: 1 EraseBytes- 00:08:21.263 [2024-11-18 19:06:39.727564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:21.263 [2024-11-18 19:06:39.727593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.263 [2024-11-18 19:06:39.727655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2533274790395914 len:51 00:08:21.264 [2024-11-18 19:06:39.727677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.264 [2024-11-18 19:06:39.727744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:3617008641903833650 len:12851 00:08:21.264 [2024-11-18 19:06:39.727765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.264 [2024-11-18 19:06:39.727829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3617008641903833650 len:12851 00:08:21.264 [2024-11-18 19:06:39.727848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:21.264 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.264 #34 NEW cov: 11833 ft: 13907 corp: 17/344b lim: 50 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:08:21.264 [2024-11-18 19:06:39.767306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35975009466459137 len:4614 00:08:21.264 [2024-11-18 19:06:39.767336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.264 #35 NEW cov: 11833 ft: 13934 corp: 18/355b lim: 50 exec/s: 0 rss: 69Mb L: 11/42 MS: 1 ChangeByte- 00:08:21.264 [2024-11-18 19:06:39.807839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643072 len:1 00:08:21.264 [2024-11-18 19:06:39.807868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.264 [2024-11-18 19:06:39.807933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2752512 len:1 00:08:21.264 [2024-11-18 19:06:39.807955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.264 [2024-11-18 19:06:39.808021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:720575940379279360 len:37 00:08:21.264 [2024-11-18 19:06:39.808041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.264 [2024-11-18 19:06:39.808106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:3617008641903833650 len:12851 00:08:21.264 [2024-11-18 19:06:39.808125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.264 #36 NEW cov: 11833 ft: 13964 corp: 19/397b lim: 50 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 CrossOver- 00:08:21.264 [2024-11-18 19:06:39.847569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10995284052480 len:1 00:08:21.264 [2024-11-18 19:06:39.847598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 #37 NEW cov: 11833 ft: 13978 corp: 20/416b lim: 50 exec/s: 37 rss: 69Mb L: 19/42 MS: 1 CrossOver- 00:08:21.524 [2024-11-18 19:06:39.887629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:21.524 [2024-11-18 19:06:39.887658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 #38 NEW cov: 11833 ft: 14000 corp: 21/435b lim: 50 exec/s: 38 rss: 69Mb L: 19/42 MS: 1 ChangeBinInt- 00:08:21.524 [2024-11-18 19:06:39.917725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11076888428544 len:1 00:08:21.524 [2024-11-18 19:06:39.917754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 #39 NEW cov: 11833 ft: 14100 corp: 22/453b lim: 50 exec/s: 39 rss: 69Mb L: 18/42 MS: 1 
EraseBytes- 00:08:21.524 [2024-11-18 19:06:39.957975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10999579017216 len:128 00:08:21.524 [2024-11-18 19:06:39.958009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 [2024-11-18 19:06:39.958081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5836665120546361861 len:1 00:08:21.524 [2024-11-18 19:06:39.958103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.524 #40 NEW cov: 11833 ft: 14133 corp: 23/480b lim: 50 exec/s: 40 rss: 69Mb L: 27/42 MS: 1 PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:21.524 [2024-11-18 19:06:39.998074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35975007016321025 len:4614 00:08:21.524 [2024-11-18 19:06:39.998104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 [2024-11-18 19:06:39.998174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1358954496 len:1 00:08:21.524 [2024-11-18 19:06:39.998196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.524 #41 NEW cov: 11833 ft: 14144 corp: 24/506b lim: 50 exec/s: 41 rss: 69Mb L: 26/42 MS: 1 PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:21.524 [2024-11-18 19:06:40.038260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:169476096 len:6 00:08:21.524 [2024-11-18 19:06:40.038291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 [2024-11-18 19:06:40.038360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:1358954496 len:1 00:08:21.524 [2024-11-18 19:06:40.038381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.524 #42 NEW cov: 11833 ft: 14157 corp: 25/532b lim: 50 exec/s: 42 rss: 70Mb L: 26/42 MS: 1 ChangeBinInt- 00:08:21.524 [2024-11-18 19:06:40.078242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:106398054915681065 len:1362 00:08:21.524 [2024-11-18 19:06:40.078272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 #51 NEW cov: 11833 ft: 14169 corp: 26/542b lim: 50 exec/s: 51 rss: 70Mb L: 10/42 MS: 4 EraseBytes-InsertByte-ChangeBit-InsertByte- 00:08:21.524 [2024-11-18 19:06:40.118481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:35975007016321025 len:61179 00:08:21.524 [2024-11-18 19:06:40.118513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.524 [2024-11-18 19:06:40.118588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18444492276831879167 len:1 00:08:21.524 [2024-11-18 19:06:40.118611] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 #52 NEW cov: 11833 ft: 14174 corp: 27/568b lim: 50 exec/s: 52 rss: 70Mb L: 26/42 MS: 1 ChangeBinInt- 00:08:21.783 [2024-11-18 19:06:40.158614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:21.783 [2024-11-18 19:06:40.158643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.158712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967296 len:128 00:08:21.783 [2024-11-18 19:06:40.158735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 #53 NEW cov: 11833 ft: 14189 corp: 28/594b lim: 50 exec/s: 53 rss: 70Mb L: 26/42 MS: 1 PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:21.783 [2024-11-18 19:06:40.198763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3399988123302625327 len:12080 00:08:21.783 [2024-11-18 19:06:40.198792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.198856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:46180279975936 len:1 00:08:21.783 [2024-11-18 19:06:40.198878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.198943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:11 00:08:21.783 [2024-11-18 19:06:40.198963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.783 #54 NEW cov: 11833 ft: 14429 corp: 29/631b lim: 50 exec/s: 54 rss: 70Mb L: 37/42 MS: 1 InsertRepeatedBytes- 00:08:21.783 [2024-11-18 19:06:40.238826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:42497 00:08:21.783 [2024-11-18 19:06:40.238855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.238925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.783 [2024-11-18 19:06:40.238949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 #55 NEW cov: 11833 ft: 14435 corp: 30/657b lim: 50 exec/s: 55 rss: 70Mb L: 26/42 MS: 1 ChangeByte- 00:08:21.783 [2024-11-18 19:06:40.278948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:181093269504 len:2305 00:08:21.783 [2024-11-18 19:06:40.278978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.279046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.783 [2024-11-18 19:06:40.279068] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 #56 NEW cov: 11833 ft: 14444 corp: 31/685b lim: 50 exec/s: 56 rss: 70Mb L: 28/42 MS: 1 ChangeBinInt- 00:08:21.783 [2024-11-18 19:06:40.319024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:181093269504 len:1 00:08:21.783 [2024-11-18 19:06:40.319053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.783 [2024-11-18 19:06:40.319122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:141 00:08:21.783 [2024-11-18 19:06:40.319144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.783 #57 NEW cov: 11833 ft: 14478 corp: 32/714b lim: 50 exec/s: 57 rss: 70Mb L: 29/42 MS: 1 InsertByte- 00:08:21.783 [2024-11-18 19:06:40.359015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:704643082 len:1 00:08:21.783 [2024-11-18 19:06:40.359044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 #58 NEW cov: 11833 ft: 14494 corp: 33/728b lim: 50 exec/s: 58 rss: 70Mb L: 14/42 MS: 1 CopyPart- 00:08:22.042 [2024-11-18 19:06:40.399234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:282021142200320 len:53013 00:08:22.042 [2024-11-18 19:06:40.399262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 [2024-11-18 19:06:40.399330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:302338304 len:1 00:08:22.042 [2024-11-18 19:06:40.399359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.042 #59 NEW cov: 11833 ft: 14505 corp: 34/756b lim: 50 exec/s: 59 rss: 70Mb L: 28/42 MS: 1 PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:22.042 [2024-11-18 19:06:40.439264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11076888428544 len:1 00:08:22.042 [2024-11-18 19:06:40.439293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 #60 NEW cov: 11833 ft: 14513 corp: 35/774b lim: 50 exec/s: 60 rss: 70Mb L: 18/42 MS: 1 ChangeBinInt- 00:08:22.042 [2024-11-18 19:06:40.479499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14921571278237270143 len:20737 00:08:22.042 [2024-11-18 19:06:40.479527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 [2024-11-18 19:06:40.479600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.042 [2024-11-18 19:06:40.479623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.042 #61 NEW cov: 11833 ft: 14520 corp: 36/800b lim: 50 exec/s: 61 rss: 70Mb L: 26/42 MS: 1 PersAutoDict- DE: 
"\001\000\177\317\024\022\005Q"- 00:08:22.042 [2024-11-18 19:06:40.519609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:282021142200320 len:53013 00:08:22.042 [2024-11-18 19:06:40.519639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 [2024-11-18 19:06:40.519725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:72198121375224064 len:5139 00:08:22.042 [2024-11-18 19:06:40.519748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.042 #62 NEW cov: 11833 ft: 14533 corp: 37/828b lim: 50 exec/s: 62 rss: 70Mb L: 28/42 MS: 1 PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:22.042 [2024-11-18 19:06:40.559612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:106398054915681065 len:30546 00:08:22.042 [2024-11-18 19:06:40.559643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.042 #63 NEW cov: 11833 ft: 14559 corp: 38/838b lim: 50 exec/s: 63 rss: 70Mb L: 10/42 MS: 1 CopyPart- 00:08:22.042 [2024-11-18 19:06:40.599735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3170545214557257728 len:1 00:08:22.042 [2024-11-18 19:06:40.599765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.043 [2024-11-18 19:06:40.629756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3170545137247846400 len:128 00:08:22.043 [2024-11-18 19:06:40.629784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.301 #65 NEW cov: 11833 ft: 14570 corp: 39/857b lim: 50 exec/s: 65 rss: 70Mb L: 19/42 MS: 2 ChangeByte-PersAutoDict- DE: "\001\000\177\317\024\022\005Q"- 00:08:22.301 [2024-11-18 19:06:40.659978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11665298948096 len:2732 00:08:22.301 [2024-11-18 19:06:40.660009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.301 [2024-11-18 19:06:40.660078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:8608370104150882816 len:1 00:08:22.302 [2024-11-18 19:06:40.660101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.302 #66 NEW cov: 11833 ft: 14620 corp: 40/885b lim: 50 exec/s: 66 rss: 70Mb L: 28/42 MS: 1 CrossOver- 00:08:22.302 [2024-11-18 19:06:40.700210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:22.302 [2024-11-18 19:06:40.700239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.302 [2024-11-18 19:06:40.700302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967296 len:128 00:08:22.302 [2024-11-18 19:06:40.700324] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.302 [2024-11-18 19:06:40.700390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446484669764468735 len:1362 00:08:22.302 [2024-11-18 19:06:40.700410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.302 #67 NEW cov: 11833 ft: 14631 corp: 41/916b lim: 50 exec/s: 67 rss: 70Mb L: 31/42 MS: 1 InsertRepeatedBytes- 00:08:22.302 [2024-11-18 19:06:40.740118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:43654316042 len:1 00:08:22.302 [2024-11-18 19:06:40.740148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.302 #68 NEW cov: 11833 ft: 14661 corp: 42/931b lim: 50 exec/s: 68 rss: 70Mb L: 15/42 MS: 1 CrossOver- 00:08:22.302 [2024-11-18 19:06:40.780353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:705101824 len:1 00:08:22.302 [2024-11-18 19:06:40.780383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.302 [2024-11-18 19:06:40.780455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10 len:11 00:08:22.302 [2024-11-18 19:06:40.780477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.302 #69 NEW cov: 11833 ft: 14665 corp: 43/951b lim: 50 exec/s: 69 rss: 70Mb L: 20/42 MS: 1 ChangeBinInt- 00:08:22.302 [2024-11-18 19:06:40.820349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:86603989002 len:61179 00:08:22.302 [2024-11-18 19:06:40.820379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.302 #70 NEW cov: 11833 ft: 14669 corp: 44/969b lim: 50 exec/s: 70 rss: 70Mb L: 18/42 MS: 1 CrossOver- 00:08:22.302 [2024-11-18 19:06:40.860425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:11076888428544 len:1 00:08:22.302 [2024-11-18 19:06:40.860454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.302 #71 NEW cov: 11833 ft: 14714 corp: 45/988b lim: 50 exec/s: 35 rss: 70Mb L: 19/42 MS: 1 ShuffleBytes- 00:08:22.302 #71 DONE cov: 11833 ft: 14714 corp: 45/988b lim: 50 exec/s: 35 rss: 70Mb 00:08:22.302 ###### Recommended dictionary. ###### 00:08:22.302 "\001\000\177\317\024\022\005Q" # Uses: 7 00:08:22.302 ###### End of recommended dictionary. 
###### 00:08:22.302 Done 71 runs in 2 second(s) 00:08:22.561 19:06:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:22.561 19:06:41 -- ../common.sh@72 -- # (( i++ )) 00:08:22.561 19:06:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.561 19:06:41 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:22.561 19:06:41 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:22.561 19:06:41 -- nvmf/run.sh@24 -- # local timen=1 00:08:22.561 19:06:41 -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.561 19:06:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.561 19:06:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:22.561 19:06:41 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:22.561 19:06:41 -- nvmf/run.sh@29 -- # port=4420 00:08:22.561 19:06:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.561 19:06:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:22.561 19:06:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.561 19:06:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:22.561 [2024-11-18 19:06:41.054851] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:22.561 [2024-11-18 19:06:41.054930] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1308142 ] 00:08:22.561 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.820 [2024-11-18 19:06:41.234028] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.820 [2024-11-18 19:06:41.299331] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.820 [2024-11-18 19:06:41.299453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.820 [2024-11-18 19:06:41.357597] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:22.820 [2024-11-18 19:06:41.373919] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:22.820 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.820 INFO: Seed: 2827224066 00:08:22.820 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:22.820 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:22.820 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.820 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.820 #2 INITED exec/s: 0 rss: 60Mb 00:08:22.820 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:22.820 This may also happen if the target rejected all inputs we tried so far 00:08:23.078 [2024-11-18 19:06:41.439274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.078 [2024-11-18 19:06:41.439302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.078 [2024-11-18 19:06:41.439339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.078 [2024-11-18 19:06:41.439354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.078 [2024-11-18 19:06:41.439406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.078 [2024-11-18 19:06:41.439422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.336 NEW_FUNC[1/669]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:23.336 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.336 #8 NEW cov: 11648 ft: 11649 corp: 2/67b lim: 90 exec/s: 0 rss: 68Mb L: 66/66 MS: 1 InsertRepeatedBytes- 00:08:23.336 [2024-11-18 19:06:41.749835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.336 [2024-11-18 19:06:41.749877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.336 NEW_FUNC[1/3]: 0xf45318 in posix_sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1441 00:08:23.336 NEW_FUNC[2/3]: 0x16add68 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1090 00:08:23.336 #18 NEW cov: 11777 ft: 12970 corp: 3/94b lim: 90 exec/s: 0 rss: 69Mb L: 27/66 MS: 5 ShuffleBytes-InsertByte-ChangeByte-CopyPart-CrossOver- 00:08:23.336 [2024-11-18 19:06:41.799894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.336 [2024-11-18 19:06:41.799923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.336 #20 NEW cov: 11783 ft: 13287 corp: 4/129b lim: 90 exec/s: 0 rss: 69Mb L: 35/66 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:23.336 [2024-11-18 19:06:41.840007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.336 [2024-11-18 19:06:41.840035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.336 #26 NEW cov: 11868 ft: 13532 corp: 5/157b lim: 90 exec/s: 0 rss: 69Mb L: 28/66 MS: 1 InsertByte- 00:08:23.336 [2024-11-18 19:06:41.890126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.336 [2024-11-18 19:06:41.890153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.336 #29 NEW cov: 11868 ft: 13720 corp: 6/187b lim: 90 exec/s: 0 rss: 69Mb 
L: 30/66 MS: 3 ChangeByte-InsertByte-CrossOver- 00:08:23.336 [2024-11-18 19:06:41.930240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.336 [2024-11-18 19:06:41.930267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #30 NEW cov: 11868 ft: 13835 corp: 7/211b lim: 90 exec/s: 0 rss: 69Mb L: 24/66 MS: 1 EraseBytes- 00:08:23.594 [2024-11-18 19:06:41.970389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:41.970417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #31 NEW cov: 11868 ft: 13978 corp: 8/229b lim: 90 exec/s: 0 rss: 69Mb L: 18/66 MS: 1 EraseBytes- 00:08:23.594 [2024-11-18 19:06:42.010501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:42.010529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #32 NEW cov: 11868 ft: 14001 corp: 9/253b lim: 90 exec/s: 0 rss: 69Mb L: 24/66 MS: 1 ChangeBit- 00:08:23.594 [2024-11-18 19:06:42.050597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:42.050624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #33 NEW cov: 11868 ft: 14037 corp: 10/288b lim: 90 exec/s: 0 rss: 69Mb L: 35/66 MS: 1 CopyPart- 00:08:23.594 [2024-11-18 19:06:42.090713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:42.090740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #34 NEW cov: 11868 ft: 14113 corp: 11/307b lim: 90 exec/s: 0 rss: 69Mb L: 19/66 MS: 1 InsertByte- 00:08:23.594 [2024-11-18 19:06:42.130788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:42.130816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #35 NEW cov: 11868 ft: 14219 corp: 12/331b lim: 90 exec/s: 0 rss: 69Mb L: 24/66 MS: 1 ShuffleBytes- 00:08:23.594 [2024-11-18 19:06:42.170906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.594 [2024-11-18 19:06:42.170934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.594 #36 NEW cov: 11868 ft: 14234 corp: 13/358b lim: 90 exec/s: 0 rss: 69Mb L: 27/66 MS: 1 ChangeByte- 00:08:23.852 [2024-11-18 19:06:42.211039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.852 [2024-11-18 19:06:42.211067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.852 #37 NEW cov: 11868 ft: 14266 corp: 14/393b lim: 90 exec/s: 0 rss: 70Mb L: 35/66 MS: 1 ShuffleBytes- 00:08:23.852 [2024-11-18 19:06:42.251148] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.852 [2024-11-18 19:06:42.251175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.852 #38 NEW cov: 11868 ft: 14312 corp: 15/428b lim: 90 exec/s: 0 rss: 70Mb L: 35/66 MS: 1 ChangeBinInt- 00:08:23.852 [2024-11-18 19:06:42.291730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.852 [2024-11-18 19:06:42.291757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.852 [2024-11-18 19:06:42.291799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.853 [2024-11-18 19:06:42.291815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.291867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.853 [2024-11-18 19:06:42.291881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.291934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.853 [2024-11-18 19:06:42.291949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.853 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.853 #39 NEW cov: 11891 ft: 14765 corp: 16/506b lim: 90 exec/s: 0 rss: 70Mb L: 78/78 MS: 1 CrossOver- 00:08:23.853 [2024-11-18 19:06:42.331674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.853 [2024-11-18 19:06:42.331700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.331738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.853 [2024-11-18 19:06:42.331754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.331810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.853 [2024-11-18 19:06:42.331825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.853 #40 NEW cov: 11891 ft: 14809 corp: 17/572b lim: 90 exec/s: 0 rss: 70Mb L: 66/78 MS: 1 ShuffleBytes- 00:08:23.853 [2024-11-18 19:06:42.371500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.853 [2024-11-18 19:06:42.371527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.853 #41 NEW cov: 11891 ft: 14818 corp: 18/596b lim: 90 exec/s: 0 rss: 70Mb L: 24/78 MS: 1 ChangeBinInt- 00:08:23.853 [2024-11-18 19:06:42.411633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 
cid:0 nsid:0 00:08:23.853 [2024-11-18 19:06:42.411660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.853 #42 NEW cov: 11891 ft: 14832 corp: 19/615b lim: 90 exec/s: 42 rss: 70Mb L: 19/78 MS: 1 ChangeBinInt- 00:08:23.853 [2024-11-18 19:06:42.452079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.853 [2024-11-18 19:06:42.452123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.452163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.853 [2024-11-18 19:06:42.452179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.853 [2024-11-18 19:06:42.452235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.853 [2024-11-18 19:06:42.452252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.111 #43 NEW cov: 11891 ft: 14847 corp: 20/684b lim: 90 exec/s: 43 rss: 70Mb L: 69/78 MS: 1 InsertRepeatedBytes- 00:08:24.111 [2024-11-18 19:06:42.491842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.491869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.111 #44 NEW cov: 11891 ft: 14883 corp: 21/708b lim: 90 exec/s: 44 rss: 70Mb L: 24/78 MS: 1 EraseBytes- 00:08:24.111 [2024-11-18 19:06:42.531994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.532021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.111 #45 NEW cov: 11891 ft: 14894 corp: 22/732b lim: 90 exec/s: 45 rss: 70Mb L: 24/78 MS: 1 ChangeByte- 00:08:24.111 [2024-11-18 19:06:42.572123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.572151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.111 #46 NEW cov: 11891 ft: 14900 corp: 23/756b lim: 90 exec/s: 46 rss: 70Mb L: 24/78 MS: 1 ChangeBinInt- 00:08:24.111 [2024-11-18 19:06:42.612373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.612401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.111 [2024-11-18 19:06:42.612456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.111 [2024-11-18 19:06:42.612473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.111 #52 NEW cov: 11891 ft: 15183 corp: 24/792b lim: 90 exec/s: 52 rss: 70Mb L: 36/78 MS: 1 EraseBytes- 00:08:24.111 [2024-11-18 19:06:42.652774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.652801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.111 [2024-11-18 19:06:42.652848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.111 [2024-11-18 19:06:42.652864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.111 [2024-11-18 19:06:42.652916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.111 [2024-11-18 19:06:42.652932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.111 [2024-11-18 19:06:42.652985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.111 [2024-11-18 19:06:42.653001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.111 #53 NEW cov: 11891 ft: 15187 corp: 25/879b lim: 90 exec/s: 53 rss: 70Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:08:24.111 [2024-11-18 19:06:42.692425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.111 [2024-11-18 19:06:42.692451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.369 #54 NEW cov: 11891 ft: 15290 corp: 26/904b lim: 90 exec/s: 54 rss: 70Mb L: 25/87 MS: 1 CrossOver- 00:08:24.369 [2024-11-18 19:06:42.733005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.369 [2024-11-18 19:06:42.733033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.369 [2024-11-18 19:06:42.733070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.369 [2024-11-18 19:06:42.733085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.369 [2024-11-18 19:06:42.733137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.370 [2024-11-18 19:06:42.733152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.733206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.370 [2024-11-18 19:06:42.733221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.370 #55 NEW cov: 11891 ft: 15330 corp: 27/991b lim: 90 exec/s: 55 rss: 70Mb L: 87/87 MS: 1 CopyPart- 00:08:24.370 [2024-11-18 19:06:42.773122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.370 [2024-11-18 19:06:42.773150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.773193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.370 [2024-11-18 19:06:42.773208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.773261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.370 [2024-11-18 19:06:42.773276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.773330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.370 [2024-11-18 19:06:42.773346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.370 #61 NEW cov: 11891 ft: 15340 corp: 28/1068b lim: 90 exec/s: 61 rss: 70Mb L: 77/87 MS: 1 CopyPart- 00:08:24.370 [2024-11-18 19:06:42.812784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.370 [2024-11-18 19:06:42.812812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.370 #62 NEW cov: 11891 ft: 15353 corp: 29/1091b lim: 90 exec/s: 62 rss: 70Mb L: 23/87 MS: 1 EraseBytes- 00:08:24.370 [2024-11-18 19:06:42.852916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.370 [2024-11-18 19:06:42.852944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.370 #63 NEW cov: 11891 ft: 15376 corp: 30/1126b lim: 90 exec/s: 63 rss: 70Mb L: 35/87 MS: 1 ShuffleBytes- 00:08:24.370 [2024-11-18 19:06:42.892992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.370 [2024-11-18 19:06:42.893019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.370 #64 NEW cov: 11891 ft: 15409 corp: 31/1150b lim: 90 exec/s: 64 rss: 70Mb L: 24/87 MS: 1 ChangeBit- 00:08:24.370 [2024-11-18 19:06:42.933560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.370 [2024-11-18 19:06:42.933583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.933603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.370 [2024-11-18 19:06:42.933619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.933673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.370 [2024-11-18 19:06:42.933688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.370 [2024-11-18 19:06:42.933728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.370 [2024-11-18 19:06:42.933743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:24.370 #65 NEW cov: 11891 ft: 15413 corp: 32/1227b lim: 90 exec/s: 65 rss: 70Mb L: 77/87 MS: 1 InsertRepeatedBytes- 00:08:24.628 [2024-11-18 19:06:42.973263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.628 [2024-11-18 19:06:42.973293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.628 #66 NEW cov: 11891 ft: 15434 corp: 33/1250b lim: 90 exec/s: 66 rss: 70Mb L: 23/87 MS: 1 ChangeBinInt- 00:08:24.628 [2024-11-18 19:06:43.013696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.628 [2024-11-18 19:06:43.013723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.628 [2024-11-18 19:06:43.013760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.628 [2024-11-18 19:06:43.013776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.628 [2024-11-18 19:06:43.013829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.629 [2024-11-18 19:06:43.013845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.629 #67 NEW cov: 11891 ft: 15439 corp: 34/1304b lim: 90 exec/s: 67 rss: 70Mb L: 54/87 MS: 1 EraseBytes- 00:08:24.629 [2024-11-18 19:06:43.053428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.629 [2024-11-18 19:06:43.053454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.629 #68 NEW cov: 11891 ft: 15446 corp: 35/1331b lim: 90 exec/s: 68 rss: 70Mb L: 27/87 MS: 1 ChangeBit- 00:08:24.629 [2024-11-18 19:06:43.093735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.629 [2024-11-18 19:06:43.093761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.629 [2024-11-18 19:06:43.093813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.629 [2024-11-18 19:06:43.093829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.629 #69 NEW cov: 11891 ft: 15449 corp: 36/1378b lim: 90 exec/s: 69 rss: 70Mb L: 47/87 MS: 1 InsertRepeatedBytes- 00:08:24.629 [2024-11-18 19:06:43.133767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.629 [2024-11-18 19:06:43.133795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.629 #70 NEW cov: 11891 ft: 15471 corp: 37/1402b lim: 90 exec/s: 70 rss: 70Mb L: 24/87 MS: 1 EraseBytes- 00:08:24.629 [2024-11-18 19:06:43.173801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.629 [2024-11-18 19:06:43.173827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.629 #71 NEW cov: 11891 ft: 15476 corp: 38/1432b lim: 90 exec/s: 71 rss: 70Mb L: 30/87 MS: 1 CopyPart- 00:08:24.629 [2024-11-18 19:06:43.213930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.629 [2024-11-18 19:06:43.213957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.887 #72 NEW cov: 11891 ft: 15487 corp: 39/1467b lim: 90 exec/s: 72 rss: 70Mb L: 35/87 MS: 1 CrossOver- 00:08:24.888 [2024-11-18 19:06:43.254034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.888 [2024-11-18 19:06:43.254060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.888 #73 NEW cov: 11891 ft: 15498 corp: 40/1486b lim: 90 exec/s: 73 rss: 70Mb L: 19/87 MS: 1 ChangeBit- 00:08:24.888 [2024-11-18 19:06:43.294315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.888 [2024-11-18 19:06:43.294341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.294385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.888 [2024-11-18 19:06:43.294401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.888 #74 NEW cov: 11891 ft: 15513 corp: 41/1523b lim: 90 exec/s: 74 rss: 70Mb L: 37/87 MS: 1 CrossOver- 00:08:24.888 [2024-11-18 19:06:43.334246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.888 [2024-11-18 19:06:43.334273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.888 #75 NEW cov: 11891 ft: 15522 corp: 42/1558b lim: 90 exec/s: 75 rss: 70Mb L: 35/87 MS: 1 ShuffleBytes- 00:08:24.888 [2024-11-18 19:06:43.374538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.888 [2024-11-18 19:06:43.374569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.374621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.888 [2024-11-18 19:06:43.374637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.414944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.888 [2024-11-18 19:06:43.414969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.415032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.888 [2024-11-18 19:06:43.415049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.415102] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.888 [2024-11-18 19:06:43.415129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.888 [2024-11-18 19:06:43.415200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.888 [2024-11-18 19:06:43.415219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.888 #77 NEW cov: 11891 ft: 15541 corp: 43/1630b lim: 90 exec/s: 38 rss: 70Mb L: 72/87 MS: 2 ChangeASCIIInt-InsertRepeatedBytes- 00:08:24.888 #77 DONE cov: 11891 ft: 15541 corp: 43/1630b lim: 90 exec/s: 38 rss: 70Mb 00:08:24.888 Done 77 runs in 2 second(s) 00:08:25.147 19:06:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:25.147 19:06:43 -- ../common.sh@72 -- # (( i++ )) 00:08:25.147 19:06:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.147 19:06:43 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:25.147 19:06:43 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:25.147 19:06:43 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.147 19:06:43 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.147 19:06:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.147 19:06:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:25.147 19:06:43 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:25.147 19:06:43 -- nvmf/run.sh@29 -- # port=4421 00:08:25.147 19:06:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.147 19:06:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:25.148 19:06:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.148 19:06:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:25.148 [2024-11-18 19:06:43.599716] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:25.148 [2024-11-18 19:06:43.599785] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1308674 ] 00:08:25.148 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.406 [2024-11-18 19:06:43.774512] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.406 [2024-11-18 19:06:43.838035] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.406 [2024-11-18 19:06:43.838151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.406 [2024-11-18 19:06:43.896089] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.406 [2024-11-18 19:06:43.912410] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:25.406 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.406 INFO: Seed: 1068256199 00:08:25.406 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:25.407 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:25.407 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.407 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.407 #2 INITED exec/s: 0 rss: 60Mb 00:08:25.407 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.407 This may also happen if the target rejected all inputs we tried so far 00:08:25.407 [2024-11-18 19:06:43.960757] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.407 [2024-11-18 19:06:43.960790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.665 NEW_FUNC[1/672]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:25.665 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.665 #24 NEW cov: 11638 ft: 11639 corp: 2/12b lim: 50 exec/s: 0 rss: 68Mb L: 11/11 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:25.665 [2024-11-18 19:06:44.261639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.665 [2024-11-18 19:06:44.261680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #30 NEW cov: 11752 ft: 12050 corp: 3/23b lim: 50 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ShuffleBytes- 00:08:25.924 [2024-11-18 19:06:44.311674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.311705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #31 NEW cov: 11758 ft: 12351 corp: 4/36b lim: 50 exec/s: 0 rss: 69Mb L: 13/13 MS: 1 CopyPart- 00:08:25.924 [2024-11-18 19:06:44.351786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.351815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #34 NEW cov: 11843 ft: 12666 corp: 5/46b lim: 50 exec/s: 0 rss: 69Mb L: 10/13 MS: 3 CopyPart-InsertByte-CMP- DE: "\022,\016,?\177\000\000"- 00:08:25.924 [2024-11-18 19:06:44.391878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.391907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #40 NEW cov: 11843 ft: 12883 corp: 6/56b lim: 50 exec/s: 0 rss: 69Mb L: 10/13 MS: 1 ShuffleBytes- 00:08:25.924 [2024-11-18 19:06:44.431978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.432008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #46 NEW cov: 11843 ft: 13011 corp: 7/67b lim: 50 exec/s: 0 rss: 69Mb L: 11/13 MS: 1 InsertByte- 00:08:25.924 [2024-11-18 19:06:44.472077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.472105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.924 #47 NEW cov: 11843 ft: 13073 corp: 8/77b lim: 50 exec/s: 0 rss: 69Mb L: 10/13 MS: 1 CopyPart- 00:08:25.924 [2024-11-18 19:06:44.512202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.924 [2024-11-18 19:06:44.512231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 #48 NEW cov: 11843 ft: 13126 corp: 9/96b lim: 50 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 PersAutoDict- DE: "\022,\016,?\177\000\000"- 00:08:26.183 [2024-11-18 19:06:44.552332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.552362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 #49 NEW cov: 11843 ft: 13166 corp: 10/107b lim: 50 exec/s: 0 rss: 69Mb L: 11/19 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:26.183 [2024-11-18 19:06:44.592440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.592469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 #50 NEW cov: 11843 ft: 13218 corp: 11/117b lim: 50 exec/s: 0 rss: 69Mb L: 10/19 MS: 1 CrossOver- 00:08:26.183 [2024-11-18 19:06:44.622864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.622893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.622967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.183 [2024-11-18 19:06:44.622988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.623056] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.183 [2024-11-18 19:06:44.623077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.183 #51 NEW cov: 11843 ft: 14052 corp: 12/149b lim: 50 exec/s: 0 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:26.183 [2024-11-18 19:06:44.663141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.663170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.663236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.183 [2024-11-18 19:06:44.663258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.663322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.183 [2024-11-18 19:06:44.663340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.663406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.183 [2024-11-18 19:06:44.663425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.183 #52 NEW cov: 11843 ft: 14456 corp: 13/194b lim: 50 exec/s: 0 rss: 69Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:26.183 [2024-11-18 19:06:44.712786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.712815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 #58 NEW cov: 11843 ft: 14473 corp: 14/208b lim: 50 exec/s: 0 rss: 69Mb L: 14/45 MS: 1 CopyPart- 00:08:26.183 [2024-11-18 19:06:44.753234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.183 [2024-11-18 19:06:44.753262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.753330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.183 [2024-11-18 19:06:44.753352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.183 [2024-11-18 19:06:44.753417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.183 [2024-11-18 19:06:44.753435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.183 #59 NEW cov: 11843 ft: 14571 corp: 15/239b lim: 50 exec/s: 0 rss: 69Mb L: 31/45 MS: 1 InsertRepeatedBytes- 00:08:26.443 [2024-11-18 19:06:44.793045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.793074] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 #60 NEW cov: 11843 ft: 14588 corp: 16/250b lim: 50 exec/s: 0 rss: 69Mb L: 11/45 MS: 1 ChangeBinInt- 00:08:26.443 [2024-11-18 19:06:44.833160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.833189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.443 #61 NEW cov: 11866 ft: 14628 corp: 17/260b lim: 50 exec/s: 0 rss: 70Mb L: 10/45 MS: 1 ShuffleBytes- 00:08:26.443 [2024-11-18 19:06:44.873273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.873302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 #67 NEW cov: 11866 ft: 14674 corp: 18/270b lim: 50 exec/s: 0 rss: 70Mb L: 10/45 MS: 1 ChangeBinInt- 00:08:26.443 [2024-11-18 19:06:44.913723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.913752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 [2024-11-18 19:06:44.913817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.443 [2024-11-18 19:06:44.913839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.443 [2024-11-18 19:06:44.913904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.443 [2024-11-18 19:06:44.913922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.443 #69 NEW cov: 11866 ft: 14675 corp: 19/309b lim: 50 exec/s: 0 rss: 70Mb L: 39/45 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:26.443 [2024-11-18 19:06:44.953504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.953533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 #70 NEW cov: 11866 ft: 14725 corp: 20/323b lim: 50 exec/s: 70 rss: 70Mb L: 14/45 MS: 1 InsertByte- 00:08:26.443 [2024-11-18 19:06:44.993671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:44.993700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.443 #71 NEW cov: 11866 ft: 14731 corp: 21/334b lim: 50 exec/s: 71 rss: 70Mb L: 11/45 MS: 1 InsertByte- 00:08:26.443 [2024-11-18 19:06:45.023741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.443 [2024-11-18 19:06:45.023769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #72 NEW cov: 11866 ft: 14768 corp: 22/348b lim: 50 exec/s: 72 
rss: 70Mb L: 14/45 MS: 1 ChangeByte- 00:08:26.702 [2024-11-18 19:06:45.063879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.063908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #73 NEW cov: 11866 ft: 14796 corp: 23/358b lim: 50 exec/s: 73 rss: 70Mb L: 10/45 MS: 1 CopyPart- 00:08:26.702 [2024-11-18 19:06:45.093946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.093974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #74 NEW cov: 11866 ft: 14826 corp: 24/368b lim: 50 exec/s: 74 rss: 70Mb L: 10/45 MS: 1 CMP- DE: "\376\377"- 00:08:26.702 [2024-11-18 19:06:45.134033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.134061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #75 NEW cov: 11866 ft: 14844 corp: 25/379b lim: 50 exec/s: 75 rss: 70Mb L: 11/45 MS: 1 ShuffleBytes- 00:08:26.702 [2024-11-18 19:06:45.174175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.174208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #76 NEW cov: 11866 ft: 14881 corp: 26/390b lim: 50 exec/s: 76 rss: 70Mb L: 11/45 MS: 1 ChangeBinInt- 00:08:26.702 [2024-11-18 19:06:45.214346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.214376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 #77 NEW cov: 11866 ft: 14891 corp: 27/401b lim: 50 exec/s: 77 rss: 70Mb L: 11/45 MS: 1 CrossOver- 00:08:26.702 [2024-11-18 19:06:45.254704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.254733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 [2024-11-18 19:06:45.254798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.702 [2024-11-18 19:06:45.254819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.702 [2024-11-18 19:06:45.254885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.702 [2024-11-18 19:06:45.254905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.702 #78 NEW cov: 11866 ft: 14921 corp: 28/433b lim: 50 exec/s: 78 rss: 70Mb L: 32/45 MS: 1 InsertByte- 00:08:26.702 [2024-11-18 19:06:45.294683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.702 [2024-11-18 19:06:45.294712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.702 [2024-11-18 19:06:45.294782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.702 [2024-11-18 19:06:45.294805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.961 #79 NEW cov: 11866 ft: 15228 corp: 29/458b lim: 50 exec/s: 79 rss: 70Mb L: 25/45 MS: 1 CopyPart- 00:08:26.961 [2024-11-18 19:06:45.344689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.344719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #80 NEW cov: 11866 ft: 15241 corp: 30/469b lim: 50 exec/s: 80 rss: 70Mb L: 11/45 MS: 1 ChangeBinInt- 00:08:26.961 [2024-11-18 19:06:45.384765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.384794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #81 NEW cov: 11866 ft: 15248 corp: 31/481b lim: 50 exec/s: 81 rss: 70Mb L: 12/45 MS: 1 InsertByte- 00:08:26.961 [2024-11-18 19:06:45.414833] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.414861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #82 NEW cov: 11866 ft: 15262 corp: 32/495b lim: 50 exec/s: 82 rss: 70Mb L: 14/45 MS: 1 ChangeBit- 00:08:26.961 [2024-11-18 19:06:45.454983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.455012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #88 NEW cov: 11866 ft: 15297 corp: 33/506b lim: 50 exec/s: 88 rss: 70Mb L: 11/45 MS: 1 PersAutoDict- DE: "\022,\016,?\177\000\000"- 00:08:26.961 [2024-11-18 19:06:45.485063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.485095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #89 NEW cov: 11866 ft: 15315 corp: 34/517b lim: 50 exec/s: 89 rss: 70Mb L: 11/45 MS: 1 InsertByte- 00:08:26.961 [2024-11-18 19:06:45.525211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.961 [2024-11-18 19:06:45.525240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.961 #90 NEW cov: 11866 ft: 15371 corp: 35/529b lim: 50 exec/s: 90 rss: 70Mb L: 12/45 MS: 1 CopyPart- 00:08:27.219 [2024-11-18 19:06:45.565391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.565421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 #91 NEW cov: 11866 ft: 15439 corp: 36/539b lim: 50 exec/s: 91 rss: 70Mb L: 10/45 MS: 1 ChangeBinInt- 
00:08:27.219 [2024-11-18 19:06:45.605416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.605445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 #95 NEW cov: 11866 ft: 15457 corp: 37/558b lim: 50 exec/s: 95 rss: 70Mb L: 19/45 MS: 4 ShuffleBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:27.219 [2024-11-18 19:06:45.645509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.645539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 #96 NEW cov: 11866 ft: 15471 corp: 38/568b lim: 50 exec/s: 96 rss: 70Mb L: 10/45 MS: 1 CopyPart- 00:08:27.219 [2024-11-18 19:06:45.685851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.685881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 [2024-11-18 19:06:45.685951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.219 [2024-11-18 19:06:45.685976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.219 #97 NEW cov: 11866 ft: 15491 corp: 39/596b lim: 50 exec/s: 97 rss: 70Mb L: 28/45 MS: 1 InsertRepeatedBytes- 00:08:27.219 [2024-11-18 19:06:45.725894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.725923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 [2024-11-18 19:06:45.725994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.219 [2024-11-18 19:06:45.726017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.219 #98 NEW cov: 11866 ft: 15507 corp: 40/618b lim: 50 exec/s: 98 rss: 70Mb L: 22/45 MS: 1 InsertRepeatedBytes- 00:08:27.219 [2024-11-18 19:06:45.765937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.765966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 #99 NEW cov: 11866 ft: 15517 corp: 41/628b lim: 50 exec/s: 99 rss: 70Mb L: 10/45 MS: 1 ChangeByte- 00:08:27.219 [2024-11-18 19:06:45.796418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.219 [2024-11-18 19:06:45.796446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.219 [2024-11-18 19:06:45.796509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.219 [2024-11-18 19:06:45.796534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.219 [2024-11-18 19:06:45.796612] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.219 [2024-11-18 19:06:45.796636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.219 [2024-11-18 19:06:45.796696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.219 [2024-11-18 19:06:45.796713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.479 #102 NEW cov: 11866 ft: 15555 corp: 42/674b lim: 50 exec/s: 102 rss: 70Mb L: 46/46 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:27.479 [2024-11-18 19:06:45.846146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.479 [2024-11-18 19:06:45.846173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.479 [2024-11-18 19:06:45.886226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.479 [2024-11-18 19:06:45.886252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.479 #104 NEW cov: 11866 ft: 15577 corp: 43/690b lim: 50 exec/s: 104 rss: 70Mb L: 16/46 MS: 2 ShuffleBytes-PersAutoDict- DE: "\376\377"- 00:08:27.479 [2024-11-18 19:06:45.926495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.479 [2024-11-18 19:06:45.926522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.479 [2024-11-18 19:06:45.926566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.479 [2024-11-18 19:06:45.926583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.479 #105 NEW cov: 11866 ft: 15591 corp: 44/712b lim: 50 exec/s: 52 rss: 70Mb L: 22/46 MS: 1 ChangeBit- 00:08:27.479 #105 DONE cov: 11866 ft: 15591 corp: 44/712b lim: 50 exec/s: 52 rss: 70Mb 00:08:27.479 ###### Recommended dictionary. ###### 00:08:27.479 "\022,\016,?\177\000\000" # Uses: 2 00:08:27.479 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:27.479 "\376\377" # Uses: 1 00:08:27.479 ###### End of recommended dictionary. 
###### 00:08:27.479 Done 105 runs in 2 second(s) 00:08:27.479 19:06:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:27.479 19:06:46 -- ../common.sh@72 -- # (( i++ )) 00:08:27.479 19:06:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.479 19:06:46 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:27.479 19:06:46 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:27.479 19:06:46 -- nvmf/run.sh@24 -- # local timen=1 00:08:27.479 19:06:46 -- nvmf/run.sh@25 -- # local core=0x1 00:08:27.479 19:06:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.479 19:06:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:27.479 19:06:46 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:27.738 19:06:46 -- nvmf/run.sh@29 -- # port=4422 00:08:27.738 19:06:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.738 19:06:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:27.738 19:06:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:27.738 19:06:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:27.738 [2024-11-18 19:06:46.117230] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:27.738 [2024-11-18 19:06:46.117310] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1309161 ] 00:08:27.738 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.738 [2024-11-18 19:06:46.293147] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.998 [2024-11-18 19:06:46.356853] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:27.998 [2024-11-18 19:06:46.356989] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.998 [2024-11-18 19:06:46.414946] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:27.998 [2024-11-18 19:06:46.431272] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:27.998 INFO: Running with entropic power schedule (0xFF, 100). 00:08:27.998 INFO: Seed: 3589258100 00:08:27.998 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:27.998 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:27.998 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.998 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.998 #2 INITED exec/s: 0 rss: 60Mb 00:08:27.998 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:27.998 This may also happen if the target rejected all inputs we tried so far 00:08:27.998 [2024-11-18 19:06:46.475990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.998 [2024-11-18 19:06:46.476022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.998 [2024-11-18 19:06:46.476071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.998 [2024-11-18 19:06:46.476093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.998 [2024-11-18 19:06:46.476122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:27.998 [2024-11-18 19:06:46.476138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.998 [2024-11-18 19:06:46.476166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:27.998 [2024-11-18 19:06:46.476182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.257 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:28.257 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.257 #9 NEW cov: 11665 ft: 11666 corp: 2/77b lim: 85 exec/s: 0 rss: 68Mb L: 76/76 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:28.257 [2024-11-18 19:06:46.796750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.257 [2024-11-18 19:06:46.796785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.257 [2024-11-18 19:06:46.796834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.257 [2024-11-18 19:06:46.796854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.257 [2024-11-18 19:06:46.796883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.257 [2024-11-18 19:06:46.796899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.257 [2024-11-18 19:06:46.796932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.257 [2024-11-18 19:06:46.796948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.257 #10 NEW cov: 11778 ft: 12116 corp: 3/153b lim: 85 exec/s: 0 rss: 69Mb L: 76/76 MS: 1 ShuffleBytes- 00:08:28.516 [2024-11-18 19:06:46.866866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.516 [2024-11-18 19:06:46.866896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:28.516 [2024-11-18 19:06:46.866929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.516 [2024-11-18 19:06:46.866946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.516 [2024-11-18 19:06:46.866977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.516 [2024-11-18 19:06:46.866993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.516 [2024-11-18 19:06:46.867022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.516 [2024-11-18 19:06:46.867038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.516 #11 NEW cov: 11784 ft: 12403 corp: 4/233b lim: 85 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 CrossOver- 00:08:28.516 [2024-11-18 19:06:46.916756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.516 [2024-11-18 19:06:46.916785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.516 #12 NEW cov: 11869 ft: 13602 corp: 5/265b lim: 85 exec/s: 0 rss: 69Mb L: 32/80 MS: 1 CrossOver- 00:08:28.516 [2024-11-18 19:06:46.986939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.516 [2024-11-18 19:06:46.986968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.516 #13 NEW cov: 11869 ft: 13704 corp: 6/297b lim: 85 exec/s: 0 rss: 69Mb L: 32/80 MS: 1 CopyPart- 00:08:28.516 [2024-11-18 19:06:47.057223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.516 [2024-11-18 19:06:47.057252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.516 [2024-11-18 19:06:47.057300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.516 [2024-11-18 19:06:47.057317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.516 [2024-11-18 19:06:47.057351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.517 [2024-11-18 19:06:47.057368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.517 #14 NEW cov: 11869 ft: 14130 corp: 7/360b lim: 85 exec/s: 0 rss: 69Mb L: 63/80 MS: 1 CrossOver- 00:08:28.517 [2024-11-18 19:06:47.117491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.517 [2024-11-18 19:06:47.117537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.517 [2024-11-18 19:06:47.117577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.517 [2024-11-18 19:06:47.117595] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.517 [2024-11-18 19:06:47.117630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.517 [2024-11-18 19:06:47.117647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.517 [2024-11-18 19:06:47.117675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.517 [2024-11-18 19:06:47.117692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.776 #15 NEW cov: 11869 ft: 14223 corp: 8/440b lim: 85 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ShuffleBytes- 00:08:28.776 [2024-11-18 19:06:47.177854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.776 [2024-11-18 19:06:47.177882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.776 [2024-11-18 19:06:47.177930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.776 [2024-11-18 19:06:47.177952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.776 [2024-11-18 19:06:47.177983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.776 [2024-11-18 19:06:47.177999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.776 #16 NEW cov: 11869 ft: 14361 corp: 9/503b lim: 85 exec/s: 0 rss: 69Mb L: 63/80 MS: 1 ShuffleBytes- 00:08:28.776 [2024-11-18 19:06:47.247641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.776 [2024-11-18 19:06:47.247671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.776 #17 NEW cov: 11869 ft: 14406 corp: 10/536b lim: 85 exec/s: 0 rss: 69Mb L: 33/80 MS: 1 InsertByte- 00:08:28.776 [2024-11-18 19:06:47.317991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.776 [2024-11-18 19:06:47.318021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.776 [2024-11-18 19:06:47.318053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.776 [2024-11-18 19:06:47.318071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.776 [2024-11-18 19:06:47.318104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.776 [2024-11-18 19:06:47.318120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.776 [2024-11-18 19:06:47.318148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.776 [2024-11-18 19:06:47.318164] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.776 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.776 #18 NEW cov: 11886 ft: 14472 corp: 11/616b lim: 85 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ChangeByte- 00:08:29.035 [2024-11-18 19:06:47.387983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.035 [2024-11-18 19:06:47.388012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.035 #19 NEW cov: 11886 ft: 14527 corp: 12/648b lim: 85 exec/s: 0 rss: 69Mb L: 32/80 MS: 1 ChangeByte- 00:08:29.035 [2024-11-18 19:06:47.438234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.035 [2024-11-18 19:06:47.438264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.438317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.035 [2024-11-18 19:06:47.438335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.438365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.035 [2024-11-18 19:06:47.438381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.035 #20 NEW cov: 11886 ft: 14556 corp: 13/711b lim: 85 exec/s: 20 rss: 69Mb L: 63/80 MS: 1 CopyPart- 00:08:29.035 [2024-11-18 19:06:47.488361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.035 [2024-11-18 19:06:47.488391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.488438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.035 [2024-11-18 19:06:47.488461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.488490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.035 [2024-11-18 19:06:47.488507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.035 #21 NEW cov: 11886 ft: 14583 corp: 14/774b lim: 85 exec/s: 21 rss: 70Mb L: 63/80 MS: 1 ChangeByte- 00:08:29.035 [2024-11-18 19:06:47.558600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.035 [2024-11-18 19:06:47.558630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.558677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.035 [2024-11-18 19:06:47.558700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.558730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.035 [2024-11-18 19:06:47.558745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.558774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.035 [2024-11-18 19:06:47.558789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.035 #22 NEW cov: 11886 ft: 14623 corp: 15/851b lim: 85 exec/s: 22 rss: 70Mb L: 77/80 MS: 1 InsertByte- 00:08:29.035 [2024-11-18 19:06:47.608686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.035 [2024-11-18 19:06:47.608714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.608762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.035 [2024-11-18 19:06:47.608785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.035 [2024-11-18 19:06:47.608815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.035 [2024-11-18 19:06:47.608831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.294 #23 NEW cov: 11886 ft: 14656 corp: 16/914b lim: 85 exec/s: 23 rss: 70Mb L: 63/80 MS: 1 CrossOver- 00:08:29.294 [2024-11-18 19:06:47.678910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.294 [2024-11-18 19:06:47.678942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.294 [2024-11-18 19:06:47.678990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.294 [2024-11-18 19:06:47.679014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.294 [2024-11-18 19:06:47.679043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.294 [2024-11-18 19:06:47.679059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.294 [2024-11-18 19:06:47.679088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.294 [2024-11-18 19:06:47.679104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.294 #34 NEW cov: 11886 ft: 14682 corp: 17/994b lim: 85 exec/s: 34 rss: 70Mb L: 80/80 MS: 1 ChangeBit- 00:08:29.294 [2024-11-18 19:06:47.728886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.294 [2024-11-18 19:06:47.728916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.294 #39 NEW cov: 11886 ft: 14754 corp: 18/1016b lim: 85 exec/s: 39 rss: 70Mb L: 22/80 MS: 5 ChangeBit-CopyPart-CMP-CrossOver-InsertRepeatedBytes- DE: "\001\000"- 00:08:29.294 [2024-11-18 19:06:47.789158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.295 [2024-11-18 19:06:47.789187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.789235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.295 [2024-11-18 19:06:47.789258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.789288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.295 [2024-11-18 19:06:47.789304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.295 #40 NEW cov: 11886 ft: 14842 corp: 19/1079b lim: 85 exec/s: 40 rss: 70Mb L: 63/80 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:29.295 [2024-11-18 19:06:47.830125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.295 [2024-11-18 19:06:47.830153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.830191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.295 [2024-11-18 19:06:47.830207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.830258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.295 [2024-11-18 19:06:47.830272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.830324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.295 [2024-11-18 19:06:47.830339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.295 #41 NEW cov: 11886 ft: 14979 corp: 20/1159b lim: 85 exec/s: 41 rss: 70Mb L: 80/80 MS: 1 ChangeBinInt- 00:08:29.295 [2024-11-18 19:06:47.870242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.295 [2024-11-18 19:06:47.870272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.870308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.295 [2024-11-18 19:06:47.870324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.870375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.295 [2024-11-18 19:06:47.870391] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.295 [2024-11-18 19:06:47.870443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.295 [2024-11-18 19:06:47.870457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.295 #42 NEW cov: 11886 ft: 15068 corp: 21/1242b lim: 85 exec/s: 42 rss: 70Mb L: 83/83 MS: 1 CopyPart- 00:08:29.554 [2024-11-18 19:06:47.910340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.554 [2024-11-18 19:06:47.910367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.910407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.554 [2024-11-18 19:06:47.910424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.910475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.554 [2024-11-18 19:06:47.910491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.910544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.554 [2024-11-18 19:06:47.910563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.554 #43 NEW cov: 11886 ft: 15084 corp: 22/1320b lim: 85 exec/s: 43 rss: 70Mb L: 78/83 MS: 1 CrossOver- 00:08:29.554 [2024-11-18 19:06:47.950330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.554 [2024-11-18 19:06:47.950357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.950394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.554 [2024-11-18 19:06:47.950409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.950462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.554 [2024-11-18 19:06:47.950478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.554 #44 NEW cov: 11886 ft: 15103 corp: 23/1379b lim: 85 exec/s: 44 rss: 70Mb L: 59/83 MS: 1 EraseBytes- 00:08:29.554 [2024-11-18 19:06:47.990566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.554 [2024-11-18 19:06:47.990593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.990635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.554 [2024-11-18 19:06:47.990650] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.990701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.554 [2024-11-18 19:06:47.990720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.554 [2024-11-18 19:06:47.990772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.555 [2024-11-18 19:06:47.990787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.555 #45 NEW cov: 11886 ft: 15141 corp: 24/1459b lim: 85 exec/s: 45 rss: 70Mb L: 80/83 MS: 1 ChangeByte- 00:08:29.555 [2024-11-18 19:06:48.030700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.555 [2024-11-18 19:06:48.030726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.030779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.555 [2024-11-18 19:06:48.030798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.030850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.555 [2024-11-18 19:06:48.030866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.030919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.555 [2024-11-18 19:06:48.030932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.555 #46 NEW cov: 11886 ft: 15147 corp: 25/1537b lim: 85 exec/s: 46 rss: 70Mb L: 78/83 MS: 1 ChangeBit- 00:08:29.555 [2024-11-18 19:06:48.070683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.555 [2024-11-18 19:06:48.070709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.070746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.555 [2024-11-18 19:06:48.070761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.070814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.555 [2024-11-18 19:06:48.070829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.555 #47 NEW cov: 11886 ft: 15158 corp: 26/1600b lim: 85 exec/s: 47 rss: 70Mb L: 63/83 MS: 1 ChangeBinInt- 00:08:29.555 [2024-11-18 19:06:48.110780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.555 [2024-11-18 
19:06:48.110806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.110843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.555 [2024-11-18 19:06:48.110858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.555 [2024-11-18 19:06:48.110911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.555 [2024-11-18 19:06:48.110927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.555 #48 NEW cov: 11886 ft: 15176 corp: 27/1663b lim: 85 exec/s: 48 rss: 70Mb L: 63/83 MS: 1 CMP- DE: "\000\000\177\301T\000m\353"- 00:08:29.555 [2024-11-18 19:06:48.150640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.555 [2024-11-18 19:06:48.150666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.814 #49 NEW cov: 11886 ft: 15228 corp: 28/1687b lim: 85 exec/s: 49 rss: 70Mb L: 24/83 MS: 1 EraseBytes- 00:08:29.814 [2024-11-18 19:06:48.191172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.814 [2024-11-18 19:06:48.191199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.814 [2024-11-18 19:06:48.191244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.814 [2024-11-18 19:06:48.191259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.191313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.815 [2024-11-18 19:06:48.191328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.191382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.815 [2024-11-18 19:06:48.191398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.815 #50 NEW cov: 11886 ft: 15267 corp: 29/1764b lim: 85 exec/s: 50 rss: 70Mb L: 77/83 MS: 1 InsertByte- 00:08:29.815 [2024-11-18 19:06:48.231151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.815 [2024-11-18 19:06:48.231178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.231224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.815 [2024-11-18 19:06:48.231240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.231292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 
nsid:0 00:08:29.815 [2024-11-18 19:06:48.231308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 #51 NEW cov: 11886 ft: 15273 corp: 30/1827b lim: 85 exec/s: 51 rss: 70Mb L: 63/83 MS: 1 ChangeByte- 00:08:29.815 [2024-11-18 19:06:48.271367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.815 [2024-11-18 19:06:48.271394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.271440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.815 [2024-11-18 19:06:48.271455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.271506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.815 [2024-11-18 19:06:48.271521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.271577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.815 [2024-11-18 19:06:48.271593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.815 #52 NEW cov: 11886 ft: 15308 corp: 31/1907b lim: 85 exec/s: 52 rss: 70Mb L: 80/83 MS: 1 ChangeBinInt- 00:08:29.815 [2024-11-18 19:06:48.301481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.815 [2024-11-18 19:06:48.301507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.301551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.815 [2024-11-18 19:06:48.301567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.301618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.815 [2024-11-18 19:06:48.301633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.301686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.815 [2024-11-18 19:06:48.301701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.815 #53 NEW cov: 11886 ft: 15322 corp: 32/1987b lim: 85 exec/s: 53 rss: 70Mb L: 80/83 MS: 1 ChangeByte- 00:08:29.815 [2024-11-18 19:06:48.341487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.815 [2024-11-18 19:06:48.341513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.341554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:1 nsid:0 00:08:29.815 [2024-11-18 19:06:48.341570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.341626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.815 [2024-11-18 19:06:48.341643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 #54 NEW cov: 11886 ft: 15326 corp: 33/2051b lim: 85 exec/s: 54 rss: 70Mb L: 64/83 MS: 1 InsertByte- 00:08:29.815 [2024-11-18 19:06:48.381707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.815 [2024-11-18 19:06:48.381733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.381768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.815 [2024-11-18 19:06:48.381784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.381835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.815 [2024-11-18 19:06:48.381851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.815 [2024-11-18 19:06:48.381905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.815 [2024-11-18 19:06:48.381921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.815 #55 NEW cov: 11893 ft: 15353 corp: 34/2134b lim: 85 exec/s: 55 rss: 70Mb L: 83/83 MS: 1 CrossOver- 00:08:30.075 [2024-11-18 19:06:48.421429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.075 [2024-11-18 19:06:48.421455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.075 #56 NEW cov: 11893 ft: 15399 corp: 35/2152b lim: 85 exec/s: 56 rss: 70Mb L: 18/83 MS: 1 EraseBytes- 00:08:30.075 [2024-11-18 19:06:48.461965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.075 [2024-11-18 19:06:48.461992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.075 [2024-11-18 19:06:48.462037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.075 [2024-11-18 19:06:48.462054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.075 [2024-11-18 19:06:48.462104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.075 [2024-11-18 19:06:48.462119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.075 [2024-11-18 19:06:48.462162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:3 nsid:0 00:08:30.075 [2024-11-18 19:06:48.462178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.075 #57 NEW cov: 11893 ft: 15422 corp: 36/2232b lim: 85 exec/s: 28 rss: 70Mb L: 80/83 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:30.075 #57 DONE cov: 11893 ft: 15422 corp: 36/2232b lim: 85 exec/s: 28 rss: 70Mb 00:08:30.075 ###### Recommended dictionary. ###### 00:08:30.075 "\001\000" # Uses: 2 00:08:30.075 "\000\000\177\301T\000m\353" # Uses: 0 00:08:30.075 ###### End of recommended dictionary. ###### 00:08:30.075 Done 57 runs in 2 second(s) 00:08:30.075 19:06:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:30.075 19:06:48 -- ../common.sh@72 -- # (( i++ )) 00:08:30.075 19:06:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.075 19:06:48 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:30.075 19:06:48 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:30.075 19:06:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.075 19:06:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.075 19:06:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.075 19:06:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:30.075 19:06:48 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:30.075 19:06:48 -- nvmf/run.sh@29 -- # port=4423 00:08:30.075 19:06:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.075 19:06:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:30.075 19:06:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.075 19:06:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:30.075 [2024-11-18 19:06:48.643347] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:30.075 [2024-11-18 19:06:48.643414] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1309508 ] 00:08:30.075 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.334 [2024-11-18 19:06:48.825545] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.334 [2024-11-18 19:06:48.889435] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.334 [2024-11-18 19:06:48.889576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.593 [2024-11-18 19:06:48.947589] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.593 [2024-11-18 19:06:48.963920] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:30.593 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:30.593 INFO: Seed: 1826299828 00:08:30.593 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:30.593 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:30.593 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.593 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.593 #2 INITED exec/s: 0 rss: 60Mb 00:08:30.593 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:30.593 This may also happen if the target rejected all inputs we tried so far 00:08:30.593 [2024-11-18 19:06:49.029583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.593 [2024-11-18 19:06:49.029622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.851 NEW_FUNC[1/671]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:30.851 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.851 #6 NEW cov: 11598 ft: 11599 corp: 2/6b lim: 25 exec/s: 0 rss: 68Mb L: 5/5 MS: 4 ChangeBit-InsertByte-CopyPart-CrossOver- 00:08:30.851 [2024-11-18 19:06:49.360526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.851 [2024-11-18 19:06:49.360605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.851 #7 NEW cov: 11711 ft: 12049 corp: 3/12b lim: 25 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 InsertByte- 00:08:30.852 [2024-11-18 19:06:49.410489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.852 [2024-11-18 19:06:49.410518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.852 #13 NEW cov: 11717 ft: 12492 corp: 4/18b lim: 25 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:30.852 [2024-11-18 19:06:49.450605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.852 [2024-11-18 19:06:49.450632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #16 NEW cov: 11802 ft: 12760 corp: 5/25b lim: 25 exec/s: 0 rss: 69Mb L: 7/7 MS: 3 ChangeBit-ShuffleBytes-CrossOver- 00:08:31.110 [2024-11-18 19:06:49.490697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.490730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #17 NEW cov: 11802 ft: 12862 corp: 6/33b lim: 25 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 InsertByte- 00:08:31.110 [2024-11-18 19:06:49.530888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.530918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #18 NEW cov: 11802 ft: 12991 corp: 7/41b lim: 25 exec/s: 0 rss: 
69Mb L: 8/8 MS: 1 ChangeBit- 00:08:31.110 [2024-11-18 19:06:49.570983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.571009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #24 NEW cov: 11802 ft: 13076 corp: 8/48b lim: 25 exec/s: 0 rss: 69Mb L: 7/8 MS: 1 ChangeBit- 00:08:31.110 [2024-11-18 19:06:49.611077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.611102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #25 NEW cov: 11802 ft: 13105 corp: 9/56b lim: 25 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 CopyPart- 00:08:31.110 [2024-11-18 19:06:49.651130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.651164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #26 NEW cov: 11802 ft: 13168 corp: 10/62b lim: 25 exec/s: 0 rss: 69Mb L: 6/8 MS: 1 ShuffleBytes- 00:08:31.110 [2024-11-18 19:06:49.681209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.110 [2024-11-18 19:06:49.681237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.110 #27 NEW cov: 11802 ft: 13185 corp: 11/67b lim: 25 exec/s: 0 rss: 69Mb L: 5/8 MS: 1 EraseBytes- 00:08:31.369 [2024-11-18 19:06:49.721335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.721369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 #28 NEW cov: 11802 ft: 13214 corp: 12/75b lim: 25 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 ChangeBit- 00:08:31.369 [2024-11-18 19:06:49.761640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.761674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 [2024-11-18 19:06:49.761809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.369 [2024-11-18 19:06:49.761831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.369 #29 NEW cov: 11802 ft: 13598 corp: 13/86b lim: 25 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:08:31.369 [2024-11-18 19:06:49.811896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.811930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 [2024-11-18 19:06:49.812049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.369 [2024-11-18 19:06:49.812075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:31.369 #30 NEW cov: 11802 ft: 13652 corp: 14/97b lim: 25 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ChangeBinInt- 00:08:31.369 [2024-11-18 19:06:49.862022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.862051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 [2024-11-18 19:06:49.862154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.369 [2024-11-18 19:06:49.862174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.369 #31 NEW cov: 11802 ft: 13670 corp: 15/109b lim: 25 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:08:31.369 [2024-11-18 19:06:49.901956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.901981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.369 #32 NEW cov: 11825 ft: 13724 corp: 16/117b lim: 25 exec/s: 0 rss: 70Mb L: 8/12 MS: 1 ChangeBit- 00:08:31.369 [2024-11-18 19:06:49.942352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.369 [2024-11-18 19:06:49.942381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.369 [2024-11-18 19:06:49.942504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.369 [2024-11-18 19:06:49.942523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.369 #33 NEW cov: 11825 ft: 13745 corp: 17/127b lim: 25 exec/s: 0 rss: 70Mb L: 10/12 MS: 1 CrossOver- 00:08:31.628 [2024-11-18 19:06:49.982211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 19:06:49.982240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 #34 NEW cov: 11825 ft: 13811 corp: 18/133b lim: 25 exec/s: 34 rss: 70Mb L: 6/12 MS: 1 EraseBytes- 00:08:31.628 [2024-11-18 19:06:50.022600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 19:06:50.022627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 [2024-11-18 19:06:50.022752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.628 [2024-11-18 19:06:50.022777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.628 #35 NEW cov: 11825 ft: 13840 corp: 19/145b lim: 25 exec/s: 35 rss: 70Mb L: 12/12 MS: 1 CrossOver- 00:08:31.628 [2024-11-18 19:06:50.072555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 
19:06:50.072588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 #36 NEW cov: 11825 ft: 13845 corp: 20/151b lim: 25 exec/s: 36 rss: 70Mb L: 6/12 MS: 1 CopyPart- 00:08:31.628 [2024-11-18 19:06:50.112640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 19:06:50.112673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 #37 NEW cov: 11825 ft: 13863 corp: 21/159b lim: 25 exec/s: 37 rss: 70Mb L: 8/12 MS: 1 ChangeByte- 00:08:31.628 [2024-11-18 19:06:50.152968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 19:06:50.152999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 [2024-11-18 19:06:50.153113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.628 [2024-11-18 19:06:50.153133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.628 #38 NEW cov: 11825 ft: 13875 corp: 22/170b lim: 25 exec/s: 38 rss: 70Mb L: 11/12 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:31.628 [2024-11-18 19:06:50.192841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.628 [2024-11-18 19:06:50.192867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.628 #39 NEW cov: 11825 ft: 13888 corp: 23/177b lim: 25 exec/s: 39 rss: 70Mb L: 7/12 MS: 1 ChangeByte- 00:08:31.891 [2024-11-18 19:06:50.233000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.233026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 #40 NEW cov: 11825 ft: 13959 corp: 24/183b lim: 25 exec/s: 40 rss: 70Mb L: 6/12 MS: 1 ShuffleBytes- 00:08:31.891 [2024-11-18 19:06:50.273064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.273096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 #41 NEW cov: 11825 ft: 13977 corp: 25/191b lim: 25 exec/s: 41 rss: 70Mb L: 8/12 MS: 1 CMP- DE: "\377\377"- 00:08:31.891 [2024-11-18 19:06:50.313467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.313513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 [2024-11-18 19:06:50.313628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.891 [2024-11-18 19:06:50.313654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.891 #42 NEW cov: 11825 ft: 13985 corp: 26/201b lim: 25 exec/s: 42 rss: 70Mb L: 10/12 MS: 1 ChangeByte- 00:08:31.891 
[2024-11-18 19:06:50.353297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.353329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 #48 NEW cov: 11825 ft: 14036 corp: 27/209b lim: 25 exec/s: 48 rss: 70Mb L: 8/12 MS: 1 ShuffleBytes- 00:08:31.891 [2024-11-18 19:06:50.393742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.393773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 [2024-11-18 19:06:50.393889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.891 [2024-11-18 19:06:50.393923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.891 #49 NEW cov: 11825 ft: 14112 corp: 28/222b lim: 25 exec/s: 49 rss: 70Mb L: 13/13 MS: 1 InsertByte- 00:08:31.891 [2024-11-18 19:06:50.443940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.443972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.891 [2024-11-18 19:06:50.444069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.891 [2024-11-18 19:06:50.444088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.891 #50 NEW cov: 11825 ft: 14193 corp: 29/234b lim: 25 exec/s: 50 rss: 70Mb L: 12/13 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:31.891 [2024-11-18 19:06:50.483867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.891 [2024-11-18 19:06:50.483893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 #51 NEW cov: 11825 ft: 14220 corp: 30/239b lim: 25 exec/s: 51 rss: 70Mb L: 5/13 MS: 1 EraseBytes- 00:08:32.253 [2024-11-18 19:06:50.523920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.523951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 #52 NEW cov: 11825 ft: 14221 corp: 31/245b lim: 25 exec/s: 52 rss: 70Mb L: 6/13 MS: 1 ChangeByte- 00:08:32.253 [2024-11-18 19:06:50.563990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.564016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 #53 NEW cov: 11825 ft: 14226 corp: 32/252b lim: 25 exec/s: 53 rss: 70Mb L: 7/13 MS: 1 CopyPart- 00:08:32.253 [2024-11-18 19:06:50.604416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.604448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:08:32.253 [2024-11-18 19:06:50.604559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.253 [2024-11-18 19:06:50.604582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.253 #54 NEW cov: 11825 ft: 14241 corp: 33/262b lim: 25 exec/s: 54 rss: 70Mb L: 10/13 MS: 1 ShuffleBytes- 00:08:32.253 [2024-11-18 19:06:50.644890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.644922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 [2024-11-18 19:06:50.645020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.253 [2024-11-18 19:06:50.645046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.253 [2024-11-18 19:06:50.645163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.253 [2024-11-18 19:06:50.645186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.253 [2024-11-18 19:06:50.645302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.253 [2024-11-18 19:06:50.645326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.253 #55 NEW cov: 11825 ft: 14735 corp: 34/285b lim: 25 exec/s: 55 rss: 70Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:32.253 [2024-11-18 19:06:50.694730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.694766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 [2024-11-18 19:06:50.694893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.253 [2024-11-18 19:06:50.694916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.253 #56 NEW cov: 11825 ft: 14739 corp: 35/297b lim: 25 exec/s: 56 rss: 70Mb L: 12/23 MS: 1 CrossOver- 00:08:32.253 [2024-11-18 19:06:50.744681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.744707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 #57 NEW cov: 11825 ft: 14742 corp: 36/303b lim: 25 exec/s: 57 rss: 70Mb L: 6/23 MS: 1 InsertByte- 00:08:32.253 [2024-11-18 19:06:50.784883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.253 [2024-11-18 19:06:50.784915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.253 [2024-11-18 19:06:50.785042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.253 [2024-11-18 19:06:50.785065] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.253 #58 NEW cov: 11825 ft: 14755 corp: 37/314b lim: 25 exec/s: 58 rss: 70Mb L: 11/23 MS: 1 ChangeBit- 00:08:32.548 [2024-11-18 19:06:50.835157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:50.835190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.835277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.549 [2024-11-18 19:06:50.835302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.875179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:50.875207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.875343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.549 [2024-11-18 19:06:50.875367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.549 #60 NEW cov: 11825 ft: 14770 corp: 38/326b lim: 25 exec/s: 60 rss: 70Mb L: 12/23 MS: 2 PersAutoDict-ChangeBit- DE: "\377\377"- 00:08:32.549 [2024-11-18 19:06:50.915074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:50.915101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 #61 NEW cov: 11825 ft: 14777 corp: 39/334b lim: 25 exec/s: 61 rss: 70Mb L: 8/23 MS: 1 CopyPart- 00:08:32.549 [2024-11-18 19:06:50.955202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:50.955250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 #62 NEW cov: 11825 ft: 14781 corp: 40/340b lim: 25 exec/s: 62 rss: 70Mb L: 6/23 MS: 1 EraseBytes- 00:08:32.549 [2024-11-18 19:06:50.995764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:50.995797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.995915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.549 [2024-11-18 19:06:50.995937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.996082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.549 [2024-11-18 19:06:50.996104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:50.996222] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.549 [2024-11-18 19:06:50.996242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:51.046094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.549 [2024-11-18 19:06:51.046123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:51.046226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.549 [2024-11-18 19:06:51.046246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:51.046367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.549 [2024-11-18 19:06:51.046391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.549 [2024-11-18 19:06:51.046511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.549 [2024-11-18 19:06:51.046532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.549 #69 NEW cov: 11825 ft: 14822 corp: 41/361b lim: 25 exec/s: 34 rss: 70Mb L: 21/23 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:32.549 #69 DONE cov: 11825 ft: 14822 corp: 41/361b lim: 25 exec/s: 34 rss: 70Mb 00:08:32.549 ###### Recommended dictionary. ###### 00:08:32.549 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:32.549 "\377\377" # Uses: 2 00:08:32.549 ###### End of recommended dictionary. 
###### 00:08:32.549 Done 69 runs in 2 second(s) 00:08:32.809 19:06:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:32.809 19:06:51 -- ../common.sh@72 -- # (( i++ )) 00:08:32.809 19:06:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.809 19:06:51 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:32.809 19:06:51 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:32.809 19:06:51 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.809 19:06:51 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.809 19:06:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.809 19:06:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:32.809 19:06:51 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:32.809 19:06:51 -- nvmf/run.sh@29 -- # port=4424 00:08:32.809 19:06:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.809 19:06:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:32.809 19:06:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.809 19:06:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:32.809 [2024-11-18 19:06:51.232152] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:32.809 [2024-11-18 19:06:51.232215] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1310057 ] 00:08:32.809 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.809 [2024-11-18 19:06:51.407960] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.069 [2024-11-18 19:06:51.471498] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.069 [2024-11-18 19:06:51.471621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.069 [2024-11-18 19:06:51.529568] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.069 [2024-11-18 19:06:51.545836] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:33.069 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.069 INFO: Seed: 114326402 00:08:33.069 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:33.069 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:33.069 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.069 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.069 #2 INITED exec/s: 0 rss: 60Mb 00:08:33.069 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:33.069 This may also happen if the target rejected all inputs we tried so far 00:08:33.069 [2024-11-18 19:06:51.590429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.069 [2024-11-18 19:06:51.590462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.329 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:33.329 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.329 #7 NEW cov: 11670 ft: 11671 corp: 2/22b lim: 100 exec/s: 0 rss: 68Mb L: 21/21 MS: 5 InsertByte-CopyPart-ChangeByte-InsertRepeatedBytes-CopyPart- 00:08:33.329 [2024-11-18 19:06:51.911246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.329 [2024-11-18 19:06:51.911284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.589 #8 NEW cov: 11783 ft: 12123 corp: 3/42b lim: 100 exec/s: 0 rss: 69Mb L: 20/21 MS: 1 EraseBytes- 00:08:33.589 [2024-11-18 19:06:51.981319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.589 [2024-11-18 19:06:51.981352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.589 #9 NEW cov: 11789 ft: 12374 corp: 4/63b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 InsertByte- 00:08:33.589 [2024-11-18 19:06:52.051476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179634176 len:22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.589 [2024-11-18 19:06:52.051508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.589 #10 NEW cov: 11874 ft: 12672 corp: 5/84b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:33.589 [2024-11-18 19:06:52.121661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:21 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.589 [2024-11-18 19:06:52.121692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.589 #11 NEW cov: 11874 ft: 12748 corp: 6/105b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:33.589 [2024-11-18 19:06:52.171803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:21 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.589 [2024-11-18 19:06:52.171834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.848 #12 NEW cov: 11874 ft: 12954 corp: 7/126b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ChangeByte- 00:08:33.848 [2024-11-18 19:06:52.241982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179372032 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.848 [2024-11-18 19:06:52.242012] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.848 #13 NEW cov: 11874 ft: 13053 corp: 8/146b lim: 100 exec/s: 0 rss: 69Mb L: 20/21 MS: 1 ChangeBit- 00:08:33.848 [2024-11-18 19:06:52.292077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16607023805562880 len:22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.848 [2024-11-18 19:06:52.292107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.848 #14 NEW cov: 11874 ft: 13096 corp: 9/167b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ChangeByte- 00:08:33.848 [2024-11-18 19:06:52.352259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:3585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.848 [2024-11-18 19:06:52.352289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.848 #22 NEW cov: 11874 ft: 13167 corp: 10/188b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 3 EraseBytes-ChangeByte-CMP- DE: "\016\000\000\000"- 00:08:33.848 [2024-11-18 19:06:52.402391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.848 [2024-11-18 19:06:52.402422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.848 #23 NEW cov: 11874 ft: 13225 corp: 11/209b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ShuffleBytes- 00:08:34.108 [2024-11-18 19:06:52.452526] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:17873512866229747289 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.452563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.108 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.108 #24 NEW cov: 11891 ft: 13259 corp: 12/229b lim: 100 exec/s: 0 rss: 69Mb L: 20/21 MS: 1 CMP- DE: "\377\377~Y\370\013y@"- 00:08:34.108 [2024-11-18 19:06:52.522703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179372032 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.522733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.108 #25 NEW cov: 11891 ft: 13313 corp: 13/249b lim: 100 exec/s: 0 rss: 69Mb L: 20/21 MS: 1 ShuffleBytes- 00:08:34.108 [2024-11-18 19:06:52.572817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4611686018427387925 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.572846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.108 #26 NEW cov: 11891 ft: 13398 corp: 14/270b lim: 100 exec/s: 26 rss: 70Mb L: 21/21 MS: 1 ChangeBit- 00:08:34.108 [2024-11-18 19:06:52.643004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.643033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.108 #27 NEW cov: 11891 ft: 13415 corp: 15/291b lim: 100 exec/s: 27 rss: 70Mb L: 21/21 MS: 1 ChangeBinInt- 00:08:34.108 [2024-11-18 19:06:52.693214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13382931975044184505 len:47546 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.693244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.108 [2024-11-18 19:06:52.693278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13382931975044184505 len:47546 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.108 [2024-11-18 19:06:52.693296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.369 #31 NEW cov: 11891 ft: 14228 corp: 16/341b lim: 100 exec/s: 31 rss: 70Mb L: 50/50 MS: 4 InsertByte-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:34.369 [2024-11-18 19:06:52.753282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15393162788885 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 19:06:52.753311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.369 #32 NEW cov: 11891 ft: 14251 corp: 17/362b lim: 100 exec/s: 32 rss: 70Mb L: 21/50 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:08:34.369 [2024-11-18 19:06:52.803416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179634176 len:22 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 19:06:52.803445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.369 #33 NEW cov: 11891 ft: 14299 corp: 18/383b lim: 100 exec/s: 33 rss: 70Mb L: 21/50 MS: 1 CMP- DE: "\374E\017\370Y\177\000\000"- 00:08:34.369 [2024-11-18 19:06:52.853673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13382931975044184505 len:47546 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 19:06:52.853701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.369 [2024-11-18 19:06:52.853733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 19:06:52.853750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.369 [2024-11-18 19:06:52.853778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13382931975044184505 len:47546 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 19:06:52.853794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.369 #34 NEW cov: 11891 ft: 14691 corp: 19/455b lim: 100 exec/s: 34 rss: 70Mb L: 72/72 MS: 1 InsertRepeatedBytes- 00:08:34.369 [2024-11-18 19:06:52.923745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446601519450030101 len:31041 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.369 [2024-11-18 
19:06:52.923778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.369 #35 NEW cov: 11891 ft: 14704 corp: 20/476b lim: 100 exec/s: 35 rss: 70Mb L: 21/72 MS: 1 PersAutoDict- DE: "\377\377~Y\370\013y@"- 00:08:34.628 [2024-11-18 19:06:52.973905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:159093161984 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.628 [2024-11-18 19:06:52.973936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 #36 NEW cov: 11891 ft: 14716 corp: 21/497b lim: 100 exec/s: 36 rss: 70Mb L: 21/72 MS: 1 InsertByte- 00:08:34.628 [2024-11-18 19:06:53.024111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:5208492443301333064 len:18505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.628 [2024-11-18 19:06:53.024140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.628 [2024-11-18 19:06:53.024171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:5208492444341520456 len:18505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.628 [2024-11-18 19:06:53.024204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.628 [2024-11-18 19:06:53.024234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3036676096 len:5377 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.628 [2024-11-18 19:06:53.024250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.629 #37 NEW cov: 11891 ft: 14735 corp: 22/557b lim: 100 exec/s: 37 rss: 70Mb L: 60/72 MS: 1 InsertRepeatedBytes- 00:08:34.629 [2024-11-18 19:06:53.084153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179306496 len:236 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.629 [2024-11-18 19:06:53.084182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.629 #38 NEW cov: 11891 ft: 14748 corp: 23/578b lim: 100 exec/s: 38 rss: 70Mb L: 21/72 MS: 1 ChangeBinInt- 00:08:34.629 [2024-11-18 19:06:53.144314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:241 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.629 [2024-11-18 19:06:53.144343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.629 #39 NEW cov: 11891 ft: 14769 corp: 24/599b lim: 100 exec/s: 39 rss: 70Mb L: 21/72 MS: 1 ChangeBinInt- 00:08:34.629 [2024-11-18 19:06:53.194469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446601137197940757 len:2881 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.629 [2024-11-18 19:06:53.194500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 #40 NEW cov: 11891 ft: 14785 corp: 25/620b lim: 100 exec/s: 40 rss: 70Mb L: 21/72 MS: 1 ShuffleBytes- 00:08:34.888 [2024-11-18 19:06:53.264675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:0 nsid:0 lba:18446601137198071829 len:2881 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.264706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 #41 NEW cov: 11891 ft: 14825 corp: 26/641b lim: 100 exec/s: 41 rss: 70Mb L: 21/72 MS: 1 ChangeBit- 00:08:34.888 [2024-11-18 19:06:53.334853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179372032 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.334883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 #42 NEW cov: 11891 ft: 14856 corp: 27/664b lim: 100 exec/s: 42 rss: 70Mb L: 23/72 MS: 1 CrossOver- 00:08:34.888 [2024-11-18 19:06:53.395007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179372032 len:5377 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.395037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 #43 NEW cov: 11891 ft: 14891 corp: 28/685b lim: 100 exec/s: 43 rss: 70Mb L: 21/72 MS: 1 ChangeBinInt- 00:08:34.888 [2024-11-18 19:06:53.455370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:56013657821151488 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.455402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.888 [2024-11-18 19:06:53.455436] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.455454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.888 [2024-11-18 19:06:53.455485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.455502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.888 [2024-11-18 19:06:53.455531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.888 [2024-11-18 19:06:53.455557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.148 #48 NEW cov: 11898 ft: 15273 corp: 29/780b lim: 100 exec/s: 48 rss: 70Mb L: 95/95 MS: 5 CMP-ChangeBit-InsertByte-ChangeBit-InsertRepeatedBytes- DE: "\001\000\000\000\000\000\000\000"- 00:08:35.148 [2024-11-18 19:06:53.515365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:179372032 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.148 [2024-11-18 19:06:53.515395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.148 [2024-11-18 19:06:53.515427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7595718147998050665 len:26986 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.148 [2024-11-18 19:06:53.515444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.148 #49 NEW cov: 11898 ft: 15277 corp: 30/825b lim: 100 exec/s: 49 rss: 70Mb L: 45/95 MS: 1 InsertRepeatedBytes- 00:08:35.148 [2024-11-18 19:06:53.585484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4611967489109131285 len:63500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.148 [2024-11-18 19:06:53.585513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.148 #50 NEW cov: 11898 ft: 15278 corp: 31/846b lim: 100 exec/s: 25 rss: 70Mb L: 21/95 MS: 1 PersAutoDict- DE: "\377\377~Y\370\013y@"- 00:08:35.148 #50 DONE cov: 11898 ft: 15278 corp: 31/846b lim: 100 exec/s: 25 rss: 70Mb 00:08:35.148 ###### Recommended dictionary. ###### 00:08:35.148 "\016\000\000\000" # Uses: 1 00:08:35.148 "\377\377~Y\370\013y@" # Uses: 2 00:08:35.148 "\374E\017\370Y\177\000\000" # Uses: 0 00:08:35.148 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:35.148 ###### End of recommended dictionary. ###### 00:08:35.148 Done 50 runs in 2 second(s) 00:08:35.408 19:06:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:35.408 19:06:53 -- ../common.sh@72 -- # (( i++ )) 00:08:35.408 19:06:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.408 19:06:53 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:35.408 00:08:35.408 real 1m4.947s 00:08:35.408 user 1m40.646s 00:08:35.408 sys 0m7.913s 00:08:35.408 19:06:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:35.408 19:06:53 -- common/autotest_common.sh@10 -- # set +x 00:08:35.408 ************************************ 00:08:35.408 END TEST nvmf_fuzz 00:08:35.408 ************************************ 00:08:35.408 19:06:53 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:35.408 19:06:53 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:35.408 19:06:53 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.408 19:06:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:35.408 19:06:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:35.408 19:06:53 -- common/autotest_common.sh@10 -- # set +x 00:08:35.408 ************************************ 00:08:35.408 START TEST vfio_fuzz 00:08:35.408 ************************************ 00:08:35.408 19:06:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.408 * Looking for test storage... 
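The "../common.sh@72" traces above show how the nvmf phase iterates: a counter is advanced and compared against fuzz_num, and each index is handed to start_llvm_fuzz with a run time and core mask (the earlier trace shows "start_llvm_fuzz 24 1 0x1", with fuzzer_type=24, timen=1 and core=0x1). A paraphrased sketch of that driver-loop shape, not SPDK's actual common.sh:

# Hypothetical restatement of the loop seen in the xtrace output above;
# the argument order comes from the traced local assignments, and
# fuzz_num=25 is inferred from the last index (24) reached in this log.
fuzz_num=25
for (( i = 0; i < fuzz_num; i++ )); do
    start_llvm_fuzz "$i" 1 0x1    # fuzzer index, minutes to run, core mask
done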
00:08:35.408 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.408 19:06:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:35.408 19:06:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:35.408 19:06:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:35.408 19:06:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:35.408 19:06:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:35.408 19:06:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:35.408 19:06:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:35.408 19:06:53 -- scripts/common.sh@335 -- # IFS=.-: 00:08:35.408 19:06:53 -- scripts/common.sh@335 -- # read -ra ver1 00:08:35.408 19:06:53 -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.408 19:06:53 -- scripts/common.sh@336 -- # read -ra ver2 00:08:35.408 19:06:53 -- scripts/common.sh@337 -- # local 'op=<' 00:08:35.408 19:06:53 -- scripts/common.sh@339 -- # ver1_l=2 00:08:35.408 19:06:53 -- scripts/common.sh@340 -- # ver2_l=1 00:08:35.408 19:06:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:35.408 19:06:53 -- scripts/common.sh@343 -- # case "$op" in 00:08:35.408 19:06:53 -- scripts/common.sh@344 -- # : 1 00:08:35.408 19:06:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:35.408 19:06:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:35.408 19:06:53 -- scripts/common.sh@364 -- # decimal 1 00:08:35.408 19:06:53 -- scripts/common.sh@352 -- # local d=1 00:08:35.408 19:06:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.408 19:06:53 -- scripts/common.sh@354 -- # echo 1 00:08:35.408 19:06:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:35.408 19:06:53 -- scripts/common.sh@365 -- # decimal 2 00:08:35.409 19:06:53 -- scripts/common.sh@352 -- # local d=2 00:08:35.409 19:06:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.409 19:06:53 -- scripts/common.sh@354 -- # echo 2 00:08:35.409 19:06:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:35.409 19:06:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:35.409 19:06:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:35.409 19:06:53 -- scripts/common.sh@367 -- # return 0 00:08:35.409 19:06:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.409 19:06:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:35.409 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.409 --rc genhtml_branch_coverage=1 00:08:35.409 --rc genhtml_function_coverage=1 00:08:35.409 --rc genhtml_legend=1 00:08:35.409 --rc geninfo_all_blocks=1 00:08:35.409 --rc geninfo_unexecuted_blocks=1 00:08:35.409 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.409 ' 00:08:35.409 19:06:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:35.409 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.409 --rc genhtml_branch_coverage=1 00:08:35.409 --rc genhtml_function_coverage=1 00:08:35.409 --rc genhtml_legend=1 00:08:35.409 --rc geninfo_all_blocks=1 00:08:35.409 --rc geninfo_unexecuted_blocks=1 00:08:35.409 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.409 ' 00:08:35.409 19:06:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:35.409 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.409 --rc genhtml_branch_coverage=1 
00:08:35.409 --rc genhtml_function_coverage=1 00:08:35.409 --rc genhtml_legend=1 00:08:35.409 --rc geninfo_all_blocks=1 00:08:35.409 --rc geninfo_unexecuted_blocks=1 00:08:35.409 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.409 ' 00:08:35.409 19:06:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:35.409 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.409 --rc genhtml_branch_coverage=1 00:08:35.409 --rc genhtml_function_coverage=1 00:08:35.409 --rc genhtml_legend=1 00:08:35.409 --rc geninfo_all_blocks=1 00:08:35.409 --rc geninfo_unexecuted_blocks=1 00:08:35.409 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.409 ' 00:08:35.409 19:06:53 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:35.409 19:06:53 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:35.409 19:06:53 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:35.409 19:06:53 -- common/autotest_common.sh@34 -- # set -e 00:08:35.409 19:06:53 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:35.409 19:06:53 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:35.409 19:06:53 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:35.409 19:06:53 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:35.409 19:06:53 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:35.409 19:06:53 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:35.409 19:06:53 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:35.409 19:06:53 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:35.409 19:06:53 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:35.409 19:06:53 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:35.409 19:06:53 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:35.409 19:06:53 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:35.409 19:06:53 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:35.409 19:06:53 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:35.409 19:06:53 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:35.409 19:06:53 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:35.409 19:06:53 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:35.409 19:06:53 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:35.409 19:06:53 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:35.409 19:06:53 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:35.409 19:06:53 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.409 19:06:53 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:35.409 19:06:53 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:35.409 19:06:53 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:35.409 19:06:53 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:35.409 19:06:53 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:35.409 19:06:53 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:35.409 
19:06:53 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:35.409 19:06:53 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:35.409 19:06:53 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:35.409 19:06:53 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:35.409 19:06:53 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:35.409 19:06:53 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:35.409 19:06:53 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:35.409 19:06:53 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:35.409 19:06:53 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:35.409 19:06:53 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:35.409 19:06:53 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:35.409 19:06:53 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:35.409 19:06:53 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:35.409 19:06:53 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:35.409 19:06:53 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:35.409 19:06:53 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:35.409 19:06:53 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:35.409 19:06:53 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:35.409 19:06:53 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:35.409 19:06:53 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:35.409 19:06:53 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:35.409 19:06:53 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:35.409 19:06:53 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:35.409 19:06:53 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:35.409 19:06:53 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:35.409 19:06:53 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:35.409 19:06:53 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:35.409 19:06:53 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:35.409 19:06:53 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:35.409 19:06:53 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:35.409 19:06:53 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:35.409 19:06:54 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:35.409 19:06:54 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:35.409 19:06:54 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:35.409 19:06:54 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:35.409 19:06:54 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:35.409 19:06:54 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:35.409 19:06:54 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:35.409 19:06:54 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:35.409 19:06:54 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:35.409 19:06:54 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:35.409 19:06:54 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:35.409 19:06:54 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
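The CONFIG_* assignments being sourced above and below are SPDK's generated build configuration; values such as CONFIG_FUZZER=y, CONFIG_FUZZER_LIB, CONFIG_UBSAN=y and CONFIG_VFIO_USER=y originate from the configure step. A hypothetical configure invocation consistent with those values (flag spellings are an assumption to verify against ./configure --help; only the library path is taken verbatim from CONFIG_FUZZER_LIB above):

# Assumed configure line; not reproduced from this log.
./configure --enable-ubsan --with-vfio-user \
    --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a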
00:08:35.409 19:06:54 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:35.409 19:06:54 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:35.409 19:06:54 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:35.409 19:06:54 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:35.409 19:06:54 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:35.409 19:06:54 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:35.409 19:06:54 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:35.409 19:06:54 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:35.409 19:06:54 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.409 19:06:54 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.671 19:06:54 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.671 19:06:54 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.671 19:06:54 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.671 19:06:54 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.671 19:06:54 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:35.671 19:06:54 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.671 19:06:54 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:35.671 19:06:54 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:35.671 19:06:54 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:35.671 19:06:54 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:35.671 19:06:54 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:35.671 19:06:54 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:35.671 19:06:54 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:35.671 19:06:54 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:35.671 #define SPDK_CONFIG_H 00:08:35.671 #define SPDK_CONFIG_APPS 1 00:08:35.671 #define SPDK_CONFIG_ARCH native 00:08:35.671 #undef SPDK_CONFIG_ASAN 00:08:35.671 #undef SPDK_CONFIG_AVAHI 00:08:35.671 #undef SPDK_CONFIG_CET 00:08:35.671 #define SPDK_CONFIG_COVERAGE 1 00:08:35.671 #define SPDK_CONFIG_CROSS_PREFIX 00:08:35.671 #undef SPDK_CONFIG_CRYPTO 00:08:35.671 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:35.671 #undef SPDK_CONFIG_CUSTOMOCF 00:08:35.671 #undef SPDK_CONFIG_DAOS 00:08:35.671 #define SPDK_CONFIG_DAOS_DIR 00:08:35.671 #define SPDK_CONFIG_DEBUG 1 00:08:35.671 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:35.671 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:35.671 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:35.671 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:35.671 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:35.671 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.671 #define SPDK_CONFIG_EXAMPLES 1 00:08:35.671 #undef SPDK_CONFIG_FC 00:08:35.671 #define SPDK_CONFIG_FC_PATH 00:08:35.671 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:08:35.671 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:35.671 #undef SPDK_CONFIG_FUSE 00:08:35.671 #define SPDK_CONFIG_FUZZER 1 00:08:35.671 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:35.671 #undef SPDK_CONFIG_GOLANG 00:08:35.671 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:35.671 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:35.671 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:35.671 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:35.671 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:35.671 #define SPDK_CONFIG_IDXD 1 00:08:35.671 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:35.671 #undef SPDK_CONFIG_IPSEC_MB 00:08:35.671 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:35.671 #define SPDK_CONFIG_ISAL 1 00:08:35.671 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:35.671 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:35.671 #define SPDK_CONFIG_LIBDIR 00:08:35.671 #undef SPDK_CONFIG_LTO 00:08:35.671 #define SPDK_CONFIG_MAX_LCORES 00:08:35.671 #define SPDK_CONFIG_NVME_CUSE 1 00:08:35.672 #undef SPDK_CONFIG_OCF 00:08:35.672 #define SPDK_CONFIG_OCF_PATH 00:08:35.672 #define SPDK_CONFIG_OPENSSL_PATH 00:08:35.672 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:35.672 #undef SPDK_CONFIG_PGO_USE 00:08:35.672 #define SPDK_CONFIG_PREFIX /usr/local 00:08:35.672 #undef SPDK_CONFIG_RAID5F 00:08:35.672 #undef SPDK_CONFIG_RBD 00:08:35.672 #define SPDK_CONFIG_RDMA 1 00:08:35.672 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:35.672 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:35.672 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:35.672 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:35.672 #undef SPDK_CONFIG_SHARED 00:08:35.672 #undef SPDK_CONFIG_SMA 00:08:35.672 #define SPDK_CONFIG_TESTS 1 00:08:35.672 #undef SPDK_CONFIG_TSAN 00:08:35.672 #define SPDK_CONFIG_UBLK 1 00:08:35.672 #define SPDK_CONFIG_UBSAN 1 00:08:35.672 #undef SPDK_CONFIG_UNIT_TESTS 00:08:35.672 #undef SPDK_CONFIG_URING 00:08:35.672 #define SPDK_CONFIG_URING_PATH 00:08:35.672 #undef SPDK_CONFIG_URING_ZNS 00:08:35.672 #undef SPDK_CONFIG_USDT 00:08:35.672 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:35.672 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:35.672 #define SPDK_CONFIG_VFIO_USER 1 00:08:35.672 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:35.672 #define SPDK_CONFIG_VHOST 1 00:08:35.672 #define SPDK_CONFIG_VIRTIO 1 00:08:35.672 #undef SPDK_CONFIG_VTUNE 00:08:35.672 #define SPDK_CONFIG_VTUNE_DIR 00:08:35.672 #define SPDK_CONFIG_WERROR 1 00:08:35.672 #define SPDK_CONFIG_WPDK_DIR 00:08:35.672 #undef SPDK_CONFIG_XNVME 00:08:35.672 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:35.672 19:06:54 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:35.672 19:06:54 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:35.672 19:06:54 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:35.672 19:06:54 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:35.672 19:06:54 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:35.672 19:06:54 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.672 19:06:54 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.672 19:06:54 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.672 19:06:54 -- paths/export.sh@5 -- # export PATH 00:08:35.672 19:06:54 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.672 19:06:54 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.672 19:06:54 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.672 19:06:54 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.672 19:06:54 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.672 19:06:54 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:35.672 19:06:54 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.672 19:06:54 -- pm/common@16 -- # TEST_TAG=N/A 00:08:35.672 19:06:54 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:35.672 19:06:54 -- common/autotest_common.sh@52 -- # : 1 00:08:35.672 19:06:54 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:35.672 19:06:54 -- common/autotest_common.sh@56 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:35.672 19:06:54 -- common/autotest_common.sh@58 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:35.672 19:06:54 -- common/autotest_common.sh@60 -- # : 1 00:08:35.672 19:06:54 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:08:35.672 19:06:54 -- common/autotest_common.sh@62 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:35.672 19:06:54 -- common/autotest_common.sh@64 -- # : 00:08:35.672 19:06:54 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:35.672 19:06:54 -- common/autotest_common.sh@66 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:35.672 19:06:54 -- common/autotest_common.sh@68 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:35.672 19:06:54 -- common/autotest_common.sh@70 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:35.672 19:06:54 -- common/autotest_common.sh@72 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:35.672 19:06:54 -- common/autotest_common.sh@74 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:35.672 19:06:54 -- common/autotest_common.sh@76 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:35.672 19:06:54 -- common/autotest_common.sh@78 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:35.672 19:06:54 -- common/autotest_common.sh@80 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:35.672 19:06:54 -- common/autotest_common.sh@82 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:35.672 19:06:54 -- common/autotest_common.sh@84 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:35.672 19:06:54 -- common/autotest_common.sh@86 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:35.672 19:06:54 -- common/autotest_common.sh@88 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:35.672 19:06:54 -- common/autotest_common.sh@90 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:35.672 19:06:54 -- common/autotest_common.sh@92 -- # : 1 00:08:35.672 19:06:54 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:35.672 19:06:54 -- common/autotest_common.sh@94 -- # : 1 00:08:35.672 19:06:54 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:35.672 19:06:54 -- common/autotest_common.sh@96 -- # : rdma 00:08:35.672 19:06:54 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:35.672 19:06:54 -- common/autotest_common.sh@98 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:35.672 19:06:54 -- common/autotest_common.sh@100 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:35.672 19:06:54 -- common/autotest_common.sh@102 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:35.672 19:06:54 -- common/autotest_common.sh@104 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:35.672 19:06:54 -- common/autotest_common.sh@106 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:35.672 19:06:54 -- common/autotest_common.sh@108 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:08:35.672 19:06:54 -- common/autotest_common.sh@110 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:35.672 19:06:54 -- common/autotest_common.sh@112 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:35.672 19:06:54 -- common/autotest_common.sh@114 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:35.672 19:06:54 -- common/autotest_common.sh@116 -- # : 1 00:08:35.672 19:06:54 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:35.672 19:06:54 -- common/autotest_common.sh@118 -- # : 00:08:35.672 19:06:54 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:35.672 19:06:54 -- common/autotest_common.sh@120 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:35.672 19:06:54 -- common/autotest_common.sh@122 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:35.672 19:06:54 -- common/autotest_common.sh@124 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:35.672 19:06:54 -- common/autotest_common.sh@126 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:35.672 19:06:54 -- common/autotest_common.sh@128 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:35.672 19:06:54 -- common/autotest_common.sh@130 -- # : 0 00:08:35.672 19:06:54 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:35.672 19:06:54 -- common/autotest_common.sh@132 -- # : 00:08:35.672 19:06:54 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:35.672 19:06:54 -- common/autotest_common.sh@134 -- # : true 00:08:35.672 19:06:54 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:35.672 19:06:54 -- common/autotest_common.sh@136 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:35.673 19:06:54 -- common/autotest_common.sh@138 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:35.673 19:06:54 -- common/autotest_common.sh@140 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:35.673 19:06:54 -- common/autotest_common.sh@142 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:35.673 19:06:54 -- common/autotest_common.sh@144 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:35.673 19:06:54 -- common/autotest_common.sh@146 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:35.673 19:06:54 -- common/autotest_common.sh@148 -- # : 00:08:35.673 19:06:54 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:35.673 19:06:54 -- common/autotest_common.sh@150 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:35.673 19:06:54 -- common/autotest_common.sh@152 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:35.673 19:06:54 -- common/autotest_common.sh@154 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:35.673 19:06:54 -- common/autotest_common.sh@156 -- # : 0 00:08:35.673 19:06:54 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:35.673 19:06:54 -- common/autotest_common.sh@158 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:35.673 19:06:54 -- common/autotest_common.sh@160 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:35.673 19:06:54 -- common/autotest_common.sh@163 -- # : 00:08:35.673 19:06:54 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:35.673 19:06:54 -- common/autotest_common.sh@165 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:35.673 19:06:54 -- common/autotest_common.sh@167 -- # : 0 00:08:35.673 19:06:54 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:35.673 19:06:54 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.673 19:06:54 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.673 19:06:54 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.673 19:06:54 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:35.673 19:06:54 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:35.673 19:06:54 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.673 19:06:54 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.673 19:06:54 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.673 19:06:54 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.673 19:06:54 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:35.673 19:06:54 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:35.673 19:06:54 -- common/autotest_common.sh@196 -- # cat 00:08:35.673 19:06:54 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:35.673 19:06:54 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.673 19:06:54 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.673 19:06:54 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.673 19:06:54 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.673 19:06:54 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:35.673 19:06:54 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:35.673 19:06:54 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.673 19:06:54 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.673 19:06:54 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.673 19:06:54 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.673 19:06:54 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.673 19:06:54 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.673 19:06:54 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.673 19:06:54 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.673 19:06:54 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.673 19:06:54 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.673 19:06:54 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:35.673 19:06:54 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:35.673 19:06:54 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:35.673 19:06:54 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:35.673 19:06:54 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:35.673 19:06:54 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:35.673 19:06:54 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:35.673 19:06:54 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:35.673 19:06:54 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:35.673 19:06:54 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:35.673 19:06:54 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:35.673 19:06:54 -- common/autotest_common.sh@259 -- # valgrind= 00:08:35.673 19:06:54 -- common/autotest_common.sh@265 -- # uname -s 00:08:35.673 19:06:54 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:35.673 19:06:54 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:35.673 19:06:54 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:35.673 19:06:54 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:35.673 19:06:54 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:35.673 19:06:54 
-- common/autotest_common.sh@275 -- # MAKE=make 00:08:35.673 19:06:54 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:35.673 19:06:54 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:35.673 19:06:54 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:35.673 19:06:54 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:35.673 19:06:54 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:35.673 19:06:54 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:35.673 19:06:54 -- common/autotest_common.sh@319 -- # [[ -z 1310594 ]] 00:08:35.673 19:06:54 -- common/autotest_common.sh@319 -- # kill -0 1310594 00:08:35.673 19:06:54 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:35.673 19:06:54 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:35.674 19:06:54 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:35.674 19:06:54 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:35.674 19:06:54 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:35.674 19:06:54 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:35.674 19:06:54 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:35.674 19:06:54 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.KEGAZZ 00:08:35.674 19:06:54 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:35.674 19:06:54 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.KEGAZZ/tests/vfio /tmp/spdk.KEGAZZ 00:08:35.674 19:06:54 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@328 -- # df -T 00:08:35.674 19:06:54 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=54448087040 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730582528 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=7282495488 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864031744 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=30865092608 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865293312 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=200704 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:35.674 19:06:54 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:35.674 19:06:54 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:35.674 19:06:54 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.674 19:06:54 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:35.674 * Looking for test storage... 
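The trace above reads each `df -T` record into per-mount arrays (mounts, fss, sizes, avails, uses) and announces the storage search; the entries that follow pick the first candidate directory with enough room. A minimal sketch of that pattern, reconstructed from the traced variable names and values (requested_size and storage_candidates come from the surrounding set_test_storage call; the *1024 scaling is an assumption consistent with the byte counts printed above):

    declare -A mounts fss sizes avails uses

    # Parse `df -T`, dropping the header row; df reports 1K blocks.
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))
        uses["$mount"]=$((use * 1024))
        avails["$mount"]=$((avail * 1024))
    done < <(df -T | grep -v Filesystem)

    # Take the first candidate whose filesystem can hold requested_size;
    # on tmpfs/ramfs and the root overlay, also keep projected usage <= 95%.
    for target_dir in "${storage_candidates[@]}"; do
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails["$mount"]:-0}
        ((target_space == 0 || target_space < requested_size)) && continue
        if [[ ${fss["$mount"]} == tmpfs || ${fss["$mount"]} == ramfs || $mount == / ]]; then
            new_size=$((uses["$mount"] + requested_size))
            ((new_size * 100 / sizes["$mount"] > 95)) && continue
        fi
        export SPDK_TEST_STORAGE=$target_dir
        break
    done

In this log the first candidate wins: the spdk_root overlay on / shows ~54 GB available against a ~2.2 GB request, and the projected usage (new_size=9497088000 of 61730582528, about 15%) stays well under the 95% cutoff, so the vfio test directory itself becomes SPDK_TEST_STORAGE, as the entries below confirm.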
00:08:35.674 19:06:54 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:35.674 19:06:54 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:35.674 19:06:54 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.674 19:06:54 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:35.674 19:06:54 -- common/autotest_common.sh@373 -- # mount=/ 00:08:35.674 19:06:54 -- common/autotest_common.sh@375 -- # target_space=54448087040 00:08:35.674 19:06:54 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:35.674 19:06:54 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:35.674 19:06:54 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@382 -- # new_size=9497088000 00:08:35.674 19:06:54 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:35.674 19:06:54 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.674 19:06:54 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.674 19:06:54 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.674 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.674 19:06:54 -- common/autotest_common.sh@390 -- # return 0 00:08:35.674 19:06:54 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:35.674 19:06:54 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:35.674 19:06:54 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:35.674 19:06:54 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:35.674 19:06:54 -- common/autotest_common.sh@1682 -- # true 00:08:35.674 19:06:54 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:35.674 19:06:54 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@27 -- # exec 00:08:35.674 19:06:54 -- common/autotest_common.sh@29 -- # exec 00:08:35.674 19:06:54 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:35.674 19:06:54 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:35.674 19:06:54 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:35.674 19:06:54 -- common/autotest_common.sh@18 -- # set -x 00:08:35.674 19:06:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:35.674 19:06:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:35.674 19:06:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:35.674 19:06:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:35.674 19:06:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:35.674 19:06:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:35.674 19:06:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:35.674 19:06:54 -- scripts/common.sh@335 -- # IFS=.-: 00:08:35.674 19:06:54 -- scripts/common.sh@335 -- # read -ra ver1 00:08:35.674 19:06:54 -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.674 19:06:54 -- scripts/common.sh@336 -- # read -ra ver2 00:08:35.674 19:06:54 -- scripts/common.sh@337 -- # local 'op=<' 00:08:35.674 19:06:54 -- scripts/common.sh@339 -- # ver1_l=2 00:08:35.674 19:06:54 -- scripts/common.sh@340 -- # ver2_l=1 00:08:35.674 19:06:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:35.674 19:06:54 -- scripts/common.sh@343 -- # case "$op" in 00:08:35.674 19:06:54 -- scripts/common.sh@344 -- # : 1 00:08:35.674 19:06:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:35.674 19:06:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:35.674 19:06:54 -- scripts/common.sh@364 -- # decimal 1 00:08:35.674 19:06:54 -- scripts/common.sh@352 -- # local d=1 00:08:35.674 19:06:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.674 19:06:54 -- scripts/common.sh@354 -- # echo 1 00:08:35.674 19:06:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:35.674 19:06:54 -- scripts/common.sh@365 -- # decimal 2 00:08:35.674 19:06:54 -- scripts/common.sh@352 -- # local d=2 00:08:35.674 19:06:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.674 19:06:54 -- scripts/common.sh@354 -- # echo 2 00:08:35.674 19:06:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:35.674 19:06:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:35.674 19:06:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:35.674 19:06:54 -- scripts/common.sh@367 -- # return 0 00:08:35.674 19:06:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.674 19:06:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:35.674 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.674 --rc genhtml_branch_coverage=1 00:08:35.674 --rc genhtml_function_coverage=1 00:08:35.674 --rc genhtml_legend=1 00:08:35.674 --rc geninfo_all_blocks=1 00:08:35.674 --rc geninfo_unexecuted_blocks=1 00:08:35.674 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.674 ' 00:08:35.674 19:06:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:35.674 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.674 --rc genhtml_branch_coverage=1 00:08:35.674 --rc genhtml_function_coverage=1 00:08:35.674 --rc genhtml_legend=1 00:08:35.674 --rc geninfo_all_blocks=1 00:08:35.674 --rc geninfo_unexecuted_blocks=1 00:08:35.674 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.674 ' 00:08:35.674 19:06:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:35.674 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:08:35.675 --rc genhtml_branch_coverage=1 00:08:35.675 --rc genhtml_function_coverage=1 00:08:35.675 --rc genhtml_legend=1 00:08:35.675 --rc geninfo_all_blocks=1 00:08:35.675 --rc geninfo_unexecuted_blocks=1 00:08:35.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.675 ' 00:08:35.675 19:06:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:35.675 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.675 --rc genhtml_branch_coverage=1 00:08:35.675 --rc genhtml_function_coverage=1 00:08:35.675 --rc genhtml_legend=1 00:08:35.675 --rc geninfo_all_blocks=1 00:08:35.675 --rc geninfo_unexecuted_blocks=1 00:08:35.675 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.675 ' 00:08:35.675 19:06:54 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:35.675 19:06:54 -- ../common.sh@8 -- # pids=() 00:08:35.675 19:06:54 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.675 19:06:54 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.675 19:06:54 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:35.675 19:06:54 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:35.675 19:06:54 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:35.675 19:06:54 -- vfio/run.sh@65 -- # mem_size=0 00:08:35.675 19:06:54 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:35.675 19:06:54 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:35.675 19:06:54 -- ../common.sh@69 -- # local fuzz_num=7 00:08:35.675 19:06:54 -- ../common.sh@70 -- # local time=1 00:08:35.675 19:06:54 -- ../common.sh@72 -- # (( i = 0 )) 00:08:35.675 19:06:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.675 19:06:54 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:35.675 19:06:54 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:35.675 19:06:54 -- vfio/run.sh@23 -- # local timen=1 00:08:35.675 19:06:54 -- vfio/run.sh@24 -- # local core=0x1 00:08:35.675 19:06:54 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.675 19:06:54 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:35.675 19:06:54 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:35.675 19:06:54 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:35.675 19:06:54 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:35.675 19:06:54 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.675 19:06:54 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:35.675 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:35.675 19:06:54 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:35.934 [2024-11-18 19:06:54.286198] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:35.934 [2024-11-18 19:06:54.286284] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1310682 ] 00:08:35.934 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.934 [2024-11-18 19:06:54.360399] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.934 [2024-11-18 19:06:54.432050] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.934 [2024-11-18 19:06:54.432190] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.194 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.194 INFO: Seed: 3166333312 00:08:36.194 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:36.194 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:36.194 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.194 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.194 #2 INITED exec/s: 0 rss: 62Mb 00:08:36.194 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.194 This may also happen if the target rejected all inputs we tried so far 00:08:36.712 NEW_FUNC[1/631]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:36.712 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:36.712 #7 NEW cov: 10765 ft: 10689 corp: 2/38b lim: 60 exec/s: 0 rss: 67Mb L: 37/37 MS: 5 ShuffleBytes-InsertByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:36.712 #8 NEW cov: 10782 ft: 14484 corp: 3/69b lim: 60 exec/s: 0 rss: 69Mb L: 31/37 MS: 1 EraseBytes- 00:08:36.971 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.971 #10 NEW cov: 10806 ft: 15494 corp: 4/75b lim: 60 exec/s: 0 rss: 70Mb L: 6/37 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:37.230 #11 NEW cov: 10806 ft: 16009 corp: 5/106b lim: 60 exec/s: 11 rss: 70Mb L: 31/37 MS: 1 ChangeBit- 00:08:37.489 #12 NEW cov: 10806 ft: 16232 corp: 6/166b lim: 60 exec/s: 12 rss: 70Mb L: 60/60 MS: 1 CrossOver- 00:08:37.748 #13 NEW cov: 10806 ft: 16509 corp: 7/172b lim: 60 exec/s: 13 rss: 70Mb L: 6/60 MS: 1 ChangeBinInt- 00:08:37.748 #19 NEW cov: 10806 ft: 16757 corp: 8/201b lim: 60 exec/s: 19 rss: 70Mb L: 29/60 MS: 1 EraseBytes- 00:08:38.008 #20 NEW cov: 10813 ft: 16889 corp: 9/232b lim: 60 exec/s: 20 rss: 70Mb L: 31/60 MS: 1 ShuffleBytes- 00:08:38.269 #21 NEW cov: 10813 ft: 17313 corp: 10/238b lim: 60 exec/s: 10 rss: 70Mb L: 6/60 MS: 1 CopyPart- 00:08:38.269 #21 DONE cov: 10813 ft: 17313 corp: 10/238b lim: 60 exec/s: 10 rss: 70Mb 00:08:38.269 Done 21 runs in 2 second(s) 00:08:38.529 19:06:56 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:38.529 19:06:56 -- ../common.sh@72 -- # (( i++ )) 00:08:38.529 19:06:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.529 19:06:56 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:38.529 19:06:56 -- vfio/run.sh@22 -- # local fuzzer_type=1 
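Run 0 ends here: the EXIT/SIGINT trap tears down /tmp/vfio-user-0 and the loop in ../common.sh immediately re-enters the helper for fuzzer 1; the same teardown/setup sequence repeats for every index through 6. Pieced together from the traced commands, vfio/run.sh's start_llvm_fuzz looks roughly like the sketch below ($rootdir and $testdir stand in for the long Jenkins workspace paths, and the redirection of sed's output into the per-run config is an assumption, since xtrace does not show redirections):

    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local fuzzer_dir=/tmp/vfio-user-$fuzzer_type
        local vfiouser_dir=$fuzzer_dir/domain/1
        local vfiouser_io_dir=$fuzzer_dir/domain/2
        local vfiouser_cfg=$fuzzer_dir/fuzz_vfio_json.conf
        local corpus_dir=$rootdir/../corpus/llvm_vfio_$fuzzer_type

        mkdir -p "$fuzzer_dir" "$vfiouser_dir" "$vfiouser_io_dir" "$corpus_dir"

        # Rewrite the template config so this run talks to its own sockets.
        sed -e "s%/tmp/vfio-user/domain/1%$vfiouser_dir%" \
            -e "s%/tmp/vfio-user/domain/2%$vfiouser_io_dir%" \
            "$testdir/fuzz_vfio_json.conf" > "$vfiouser_cfg"

        # -t carries the per-run time budget (1 s here); -Z selects which
        # vfio-user handler is fuzzed (0 = region_rw, 1 = version, ... per
        # the NEW_FUNC lines in this log).
        "$rootdir/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m "$core" -s 0 \
            -P "$rootdir/../output/llvm/" -F "$vfiouser_dir" -c "$vfiouser_cfg" \
            -t "$timen" -D "$corpus_dir" -Y "$vfiouser_io_dir" \
            -r "$fuzzer_dir/spdk$fuzzer_type.sock" -Z "$fuzzer_type"
    }

The remainder of the log is this function re-run with fuzzer_type 1, 2, 3, 4 and 5, which is why the setup lines below mirror the ones above with only the index changed.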
00:08:38.529 19:06:56 -- vfio/run.sh@23 -- # local timen=1 00:08:38.529 19:06:56 -- vfio/run.sh@24 -- # local core=0x1 00:08:38.529 19:06:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.529 19:06:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:38.529 19:06:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:38.529 19:06:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:38.529 19:06:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:38.529 19:06:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.529 19:06:56 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:38.529 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:38.529 19:06:56 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:38.529 [2024-11-18 19:06:57.004687] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:38.529 [2024-11-18 19:06:57.004782] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1311201 ] 00:08:38.529 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.529 [2024-11-18 19:06:57.077856] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.789 [2024-11-18 19:06:57.146625] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.789 [2024-11-18 19:06:57.146758] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.789 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.789 INFO: Seed: 1585364236 00:08:38.789 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:38.789 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:38.789 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.789 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.789 #2 INITED exec/s: 0 rss: 62Mb 00:08:38.789 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:38.789 This may also happen if the target rejected all inputs we tried so far 00:08:39.047 [2024-11-18 19:06:57.401628] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.047 [2024-11-18 19:06:57.401661] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.047 [2024-11-18 19:06:57.401679] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.306 NEW_FUNC[1/638]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:39.306 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:39.306 #8 NEW cov: 10782 ft: 10408 corp: 2/6b lim: 40 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 CMP- DE: "!\000\000\000"- 00:08:39.306 [2024-11-18 19:06:57.815303] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.306 [2024-11-18 19:06:57.815338] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.306 [2024-11-18 19:06:57.815358] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.306 #9 NEW cov: 10796 ft: 13338 corp: 3/11b lim: 40 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 CopyPart- 00:08:39.564 [2024-11-18 19:06:57.929143] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.564 [2024-11-18 19:06:57.929169] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.564 [2024-11-18 19:06:57.929187] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.564 #10 NEW cov: 10796 ft: 14580 corp: 4/16b lim: 40 exec/s: 0 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:08:39.564 [2024-11-18 19:06:58.043992] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.564 [2024-11-18 19:06:58.044019] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.564 [2024-11-18 19:06:58.044039] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.564 #11 NEW cov: 10796 ft: 15356 corp: 5/54b lim: 40 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:39.823 [2024-11-18 19:06:58.167890] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.823 [2024-11-18 19:06:58.167917] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.823 [2024-11-18 19:06:58.167936] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.823 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.823 #12 NEW cov: 10813 ft: 15511 corp: 6/59b lim: 40 exec/s: 0 rss: 70Mb L: 5/38 MS: 1 ChangeByte- 00:08:39.823 [2024-11-18 19:06:58.282794] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.823 [2024-11-18 19:06:58.282820] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.823 [2024-11-18 19:06:58.282839] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.823 #13 NEW cov: 10813 ft: 15707 corp: 7/64b lim: 40 exec/s: 13 rss: 70Mb L: 5/38 MS: 1 ShuffleBytes- 00:08:39.823 [2024-11-18 19:06:58.397652] vfio_user.c:3096:vfio_user_log: 
*ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.823 [2024-11-18 19:06:58.397677] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.823 [2024-11-18 19:06:58.397695] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.082 #14 NEW cov: 10813 ft: 16067 corp: 8/81b lim: 40 exec/s: 14 rss: 70Mb L: 17/38 MS: 1 InsertRepeatedBytes- 00:08:40.082 [2024-11-18 19:06:58.511518] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.082 [2024-11-18 19:06:58.511543] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.082 [2024-11-18 19:06:58.511570] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.082 #17 NEW cov: 10813 ft: 16126 corp: 9/85b lim: 40 exec/s: 17 rss: 70Mb L: 4/38 MS: 3 EraseBytes-ChangeByte-InsertByte- 00:08:40.082 [2024-11-18 19:06:58.627353] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.082 [2024-11-18 19:06:58.627377] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.082 [2024-11-18 19:06:58.627396] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.341 #18 NEW cov: 10813 ft: 16693 corp: 10/123b lim: 40 exec/s: 18 rss: 70Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:40.341 [2024-11-18 19:06:58.741347] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.341 [2024-11-18 19:06:58.741372] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.341 [2024-11-18 19:06:58.741391] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.341 #19 NEW cov: 10813 ft: 16807 corp: 11/127b lim: 40 exec/s: 19 rss: 70Mb L: 4/38 MS: 1 ChangeBit- 00:08:40.341 [2024-11-18 19:06:58.855205] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.341 [2024-11-18 19:06:58.855229] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.341 [2024-11-18 19:06:58.855247] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.341 #20 NEW cov: 10813 ft: 17013 corp: 12/132b lim: 40 exec/s: 20 rss: 70Mb L: 5/38 MS: 1 CMP- DE: "\377\377\377\371"- 00:08:40.600 [2024-11-18 19:06:58.968196] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.600 [2024-11-18 19:06:58.968220] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.600 [2024-11-18 19:06:58.968239] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.600 #21 NEW cov: 10813 ft: 17048 corp: 13/137b lim: 40 exec/s: 21 rss: 70Mb L: 5/38 MS: 1 ChangeBinInt- 00:08:40.600 [2024-11-18 19:06:59.082048] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.600 [2024-11-18 19:06:59.082073] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.600 [2024-11-18 19:06:59.082091] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.600 #22 NEW cov: 10820 ft: 17318 corp: 14/175b lim: 40 exec/s: 22 rss: 70Mb L: 38/38 MS: 1 ChangeByte- 00:08:40.600 [2024-11-18 19:06:59.194950] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.600 
[2024-11-18 19:06:59.194975] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.600 [2024-11-18 19:06:59.194994] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.859 #23 NEW cov: 10820 ft: 17355 corp: 15/180b lim: 40 exec/s: 23 rss: 70Mb L: 5/38 MS: 1 ShuffleBytes- 00:08:40.859 [2024-11-18 19:06:59.309780] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.859 [2024-11-18 19:06:59.309804] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.859 [2024-11-18 19:06:59.309824] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.859 #24 NEW cov: 10820 ft: 17420 corp: 16/185b lim: 40 exec/s: 12 rss: 70Mb L: 5/38 MS: 1 ChangeBit- 00:08:40.859 #24 DONE cov: 10820 ft: 17420 corp: 16/185b lim: 40 exec/s: 12 rss: 70Mb 00:08:40.859 ###### Recommended dictionary. ###### 00:08:40.859 "!\000\000\000" # Uses: 0 00:08:40.859 "\377\377\377\371" # Uses: 0 00:08:40.859 ###### End of recommended dictionary. ###### 00:08:40.859 Done 24 runs in 2 second(s) 00:08:41.118 19:06:59 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:41.118 19:06:59 -- ../common.sh@72 -- # (( i++ )) 00:08:41.118 19:06:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.118 19:06:59 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:41.118 19:06:59 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:41.118 19:06:59 -- vfio/run.sh@23 -- # local timen=1 00:08:41.118 19:06:59 -- vfio/run.sh@24 -- # local core=0x1 00:08:41.118 19:06:59 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.118 19:06:59 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:41.118 19:06:59 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:41.118 19:06:59 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:41.118 19:06:59 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:41.118 19:06:59 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.118 19:06:59 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:41.118 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.118 19:06:59 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:41.118 [2024-11-18 19:06:59.688118] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:08:41.118 [2024-11-18 19:06:59.688192] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1311540 ] 00:08:41.377 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.377 [2024-11-18 19:06:59.762266] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.377 [2024-11-18 19:06:59.830664] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:41.377 [2024-11-18 19:06:59.830810] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.636 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.636 INFO: Seed: 4275375394 00:08:41.636 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:41.636 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:41.636 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.636 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.636 #2 INITED exec/s: 0 rss: 62Mb 00:08:41.636 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.636 This may also happen if the target rejected all inputs we tried so far 00:08:41.636 [2024-11-18 19:07:00.132583] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:41.636 [2024-11-18 19:07:00.132632] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.155 NEW_FUNC[1/638]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:42.155 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.155 #7 NEW cov: 10771 ft: 10741 corp: 2/12b lim: 80 exec/s: 0 rss: 68Mb L: 11/11 MS: 5 ChangeBit-InsertByte-ChangeBit-CrossOver-CMP- DE: "\000\000\000\000\000\000\215-"- 00:08:42.155 [2024-11-18 19:07:00.593818] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.155 #8 NEW cov: 10789 ft: 14090 corp: 3/24b lim: 80 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 InsertByte- 00:08:42.414 [2024-11-18 19:07:00.781601] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:42.414 [2024-11-18 19:07:00.781645] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.414 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:42.414 #9 NEW cov: 10806 ft: 14706 corp: 4/35b lim: 80 exec/s: 0 rss: 70Mb L: 11/12 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\215-"- 00:08:42.414 [2024-11-18 19:07:00.968373] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:42.414 [2024-11-18 19:07:00.968402] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.673 #10 NEW cov: 10806 ft: 15233 corp: 5/46b lim: 80 exec/s: 10 rss: 70Mb L: 11/12 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\215-"- 00:08:42.673 [2024-11-18 19:07:01.155929] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.673 #11 NEW cov: 10806 ft: 16062 corp: 6/121b lim: 80 exec/s: 11 rss: 70Mb L: 75/75 MS: 1 
InsertRepeatedBytes- 00:08:42.933 [2024-11-18 19:07:01.355235] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument 00:08:42.933 [2024-11-18 19:07:01.355268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.933 #12 NEW cov: 10806 ft: 16504 corp: 7/132b lim: 80 exec/s: 12 rss: 70Mb L: 11/75 MS: 1 ChangeByte- 00:08:43.192 [2024-11-18 19:07:01.541731] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.192 #13 NEW cov: 10806 ft: 16682 corp: 8/210b lim: 80 exec/s: 13 rss: 70Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:08:43.192 [2024-11-18 19:07:01.730692] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument 00:08:43.192 [2024-11-18 19:07:01.730723] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:43.451 #14 NEW cov: 10813 ft: 16690 corp: 9/225b lim: 80 exec/s: 14 rss: 70Mb L: 15/78 MS: 1 CopyPart- 00:08:43.451 [2024-11-18 19:07:01.918741] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.451 #15 NEW cov: 10813 ft: 16729 corp: 10/300b lim: 80 exec/s: 15 rss: 70Mb L: 75/78 MS: 1 CMP- DE: "\000\000\000\000\000\000\243\222"- 00:08:43.711 [2024-11-18 19:07:02.105678] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:43.711 [2024-11-18 19:07:02.105708] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:43.711 #21 NEW cov: 10813 ft: 16844 corp: 11/311b lim: 80 exec/s: 10 rss: 70Mb L: 11/78 MS: 1 ShuffleBytes- 00:08:43.711 #21 DONE cov: 10813 ft: 16844 corp: 11/311b lim: 80 exec/s: 10 rss: 70Mb 00:08:43.711 ###### Recommended dictionary. ###### 00:08:43.711 "\000\000\000\000\000\000\215-" # Uses: 2 00:08:43.711 "\000\000\000\000\000\000\243\222" # Uses: 0 00:08:43.711 ###### End of recommended dictionary. 
###### 00:08:43.711 Done 21 runs in 2 second(s) 00:08:43.971 19:07:02 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:43.971 19:07:02 -- ../common.sh@72 -- # (( i++ )) 00:08:43.971 19:07:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.971 19:07:02 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:43.971 19:07:02 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:43.971 19:07:02 -- vfio/run.sh@23 -- # local timen=1 00:08:43.971 19:07:02 -- vfio/run.sh@24 -- # local core=0x1 00:08:43.971 19:07:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.971 19:07:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:43.971 19:07:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:43.971 19:07:02 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:43.971 19:07:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:43.971 19:07:02 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.971 19:07:02 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:43.971 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.971 19:07:02 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:43.971 [2024-11-18 19:07:02.527831] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:43.971 [2024-11-18 19:07:02.527907] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1312070 ] 00:08:43.971 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.231 [2024-11-18 19:07:02.600133] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.231 [2024-11-18 19:07:02.665774] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.231 [2024-11-18 19:07:02.665927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.490 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.490 INFO: Seed: 2816428224 00:08:44.490 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:44.490 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:44.490 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.490 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.490 #2 INITED exec/s: 0 rss: 61Mb 00:08:44.490 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:44.490 This may also happen if the target rejected all inputs we tried so far 00:08:44.750 NEW_FUNC[1/631]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:44.750 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.750 #4 NEW cov: 10721 ft: 10701 corp: 2/109b lim: 320 exec/s: 0 rss: 68Mb L: 108/108 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:45.009 NEW_FUNC[1/1]: 0x112d458 in nvmf_bdev_ctrlr_write_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr_bdev.c:325 00:08:45.009 #10 NEW cov: 10759 ft: 13265 corp: 3/217b lim: 320 exec/s: 0 rss: 69Mb L: 108/108 MS: 1 ShuffleBytes- 00:08:45.268 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:45.268 #11 NEW cov: 10776 ft: 14966 corp: 4/424b lim: 320 exec/s: 0 rss: 70Mb L: 207/207 MS: 1 InsertRepeatedBytes- 00:08:45.527 #12 NEW cov: 10776 ft: 15725 corp: 5/573b lim: 320 exec/s: 12 rss: 70Mb L: 149/207 MS: 1 CopyPart- 00:08:45.787 #13 NEW cov: 10776 ft: 16156 corp: 6/627b lim: 320 exec/s: 13 rss: 70Mb L: 54/207 MS: 1 CrossOver- 00:08:45.787 #14 NEW cov: 10776 ft: 16300 corp: 7/834b lim: 320 exec/s: 14 rss: 70Mb L: 207/207 MS: 1 ShuffleBytes- 00:08:46.046 #15 NEW cov: 10776 ft: 16566 corp: 8/1050b lim: 320 exec/s: 15 rss: 70Mb L: 216/216 MS: 1 CopyPart- 00:08:46.304 #16 NEW cov: 10783 ft: 16783 corp: 9/1104b lim: 320 exec/s: 16 rss: 70Mb L: 54/216 MS: 1 ChangeByte- 00:08:46.564 #17 NEW cov: 10783 ft: 16875 corp: 10/1212b lim: 320 exec/s: 8 rss: 70Mb L: 108/216 MS: 1 CMP- DE: "Y\000\000\000"- 00:08:46.564 #17 DONE cov: 10783 ft: 16875 corp: 10/1212b lim: 320 exec/s: 8 rss: 70Mb 00:08:46.564 ###### Recommended dictionary. ###### 00:08:46.564 "Y\000\000\000" # Uses: 0 00:08:46.564 ###### End of recommended dictionary. 
###### 00:08:46.564 Done 17 runs in 2 second(s) 00:08:46.824 19:07:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:46.824 19:07:05 -- ../common.sh@72 -- # (( i++ )) 00:08:46.824 19:07:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.824 19:07:05 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:46.824 19:07:05 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:46.824 19:07:05 -- vfio/run.sh@23 -- # local timen=1 00:08:46.824 19:07:05 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.824 19:07:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.824 19:07:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:46.824 19:07:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:46.824 19:07:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:46.824 19:07:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:46.824 19:07:05 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.824 19:07:05 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:46.824 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.824 19:07:05 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:46.824 [2024-11-18 19:07:05.225011] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:46.824 [2024-11-18 19:07:05.225104] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1312608 ] 00:08:46.824 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.824 [2024-11-18 19:07:05.299254] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.824 [2024-11-18 19:07:05.365636] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:46.824 [2024-11-18 19:07:05.365791] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.084 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.084 INFO: Seed: 1216448254 00:08:47.084 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:47.084 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:47.084 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.084 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.084 #2 INITED exec/s: 0 rss: 62Mb 00:08:47.084 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.084 This may also happen if the target rejected all inputs we tried so far 00:08:47.603 NEW_FUNC[1/632]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:47.603 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.603 #4 NEW cov: 10744 ft: 10652 corp: 2/72b lim: 320 exec/s: 0 rss: 68Mb L: 71/71 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:47.862 #5 NEW cov: 10761 ft: 13396 corp: 3/164b lim: 320 exec/s: 0 rss: 69Mb L: 92/92 MS: 1 CopyPart- 00:08:47.862 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.862 #6 NEW cov: 10781 ft: 14580 corp: 4/266b lim: 320 exec/s: 0 rss: 70Mb L: 102/102 MS: 1 CrossOver- 00:08:48.121 #12 NEW cov: 10781 ft: 14941 corp: 5/368b lim: 320 exec/s: 12 rss: 70Mb L: 102/102 MS: 1 ChangeBit- 00:08:48.381 #13 NEW cov: 10781 ft: 15347 corp: 6/450b lim: 320 exec/s: 13 rss: 70Mb L: 82/102 MS: 1 EraseBytes- 00:08:48.640 #14 NEW cov: 10781 ft: 15466 corp: 7/500b lim: 320 exec/s: 14 rss: 70Mb L: 50/102 MS: 1 EraseBytes- 00:08:48.899 #15 NEW cov: 10781 ft: 15494 corp: 8/642b lim: 320 exec/s: 15 rss: 70Mb L: 142/142 MS: 1 CrossOver- 00:08:48.899 #16 NEW cov: 10788 ft: 15635 corp: 9/677b lim: 320 exec/s: 16 rss: 70Mb L: 35/142 MS: 1 EraseBytes- 00:08:49.159 #17 NEW cov: 10788 ft: 16360 corp: 10/779b lim: 320 exec/s: 8 rss: 70Mb L: 102/142 MS: 1 ChangeBinInt- 00:08:49.159 #17 DONE cov: 10788 ft: 16360 corp: 10/779b lim: 320 exec/s: 8 rss: 70Mb 00:08:49.159 Done 17 runs in 2 second(s) 00:08:49.419 19:07:07 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:49.419 19:07:07 -- ../common.sh@72 -- # (( i++ )) 00:08:49.419 19:07:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.419 19:07:07 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.419 19:07:07 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:49.419 19:07:07 -- vfio/run.sh@23 -- # local timen=1 00:08:49.419 19:07:07 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.419 19:07:07 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.419 19:07:07 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:49.419 19:07:07 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:49.419 19:07:07 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:49.419 19:07:07 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:49.419 19:07:07 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.419 19:07:07 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:49.419 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.419 19:07:07 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:49.419 
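The command just traced follows the same template as the earlier targets: run.sh builds a private /tmp/vfio-user-N tree, points the shared vfio-user JSON config at it with sed, and launches llvm_vfio_fuzz with a per-target corpus, socket and -Z fuzzer type; the -Z value selects the entry point, which matches the different NEW_FUNC names (fuzz_vfio_user_dma_map, dma_unmap, irq_set, ...) reported for each run. A condensed sketch of the pattern, with SPDK_DIR and CORPUS_DIR as hypothetical shorthands for the long workspace paths in the trace; this is a simplification, not the script itself:

```sh
# Simplified per-target setup, mirroring the vfio/run.sh trace above
# (the real script also handles timeouts, seeds and cleanup).
i=5                                   # fuzzer_type, as in "start_llvm_fuzz 5 1 0x1"
dir=/tmp/vfio-user-$i
mkdir -p "$dir/domain/1" "$dir/domain/2" "$CORPUS_DIR/llvm_vfio_$i"
# retarget the template config at this run's private domain directories
sed -e "s%/tmp/vfio-user/domain/1%$dir/domain/1%;
        s%/tmp/vfio-user/domain/2%$dir/domain/2%" \
    "$SPDK_DIR/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$dir/fuzz_vfio_json.conf"
"$SPDK_DIR/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" -m 0x1 -s 0 \
    -F "$dir/domain/1" -c "$dir/fuzz_vfio_json.conf" -t 1 \
    -D "$CORPUS_DIR/llvm_vfio_$i" -Y "$dir/domain/2" \
    -r "$dir/spdk$i.sock" -Z "$i"
```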
[2024-11-18 19:07:07.944744] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:49.419 [2024-11-18 19:07:07.944826] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1313152 ] 00:08:49.419 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.419 [2024-11-18 19:07:08.019742] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.678 [2024-11-18 19:07:08.086895] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.678 [2024-11-18 19:07:08.087032] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.678 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.678 INFO: Seed: 3937436948 00:08:49.936 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:49.937 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:49.937 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.937 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.937 #2 INITED exec/s: 0 rss: 62Mb 00:08:49.937 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:49.937 This may also happen if the target rejected all inputs we tried so far 00:08:49.937 [2024-11-18 19:07:08.376583] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.937 [2024-11-18 19:07:08.376702] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.196 NEW_FUNC[1/638]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:50.196 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.196 #11 NEW cov: 10784 ft: 10717 corp: 2/80b lim: 120 exec/s: 0 rss: 66Mb L: 79/79 MS: 4 ShuffleBytes-CMP-CopyPart-InsertRepeatedBytes- DE: "+\337G\271\227\262\213\000"- 00:08:50.455 [2024-11-18 19:07:08.838259] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.456 [2024-11-18 19:07:08.838303] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.456 #26 NEW cov: 10798 ft: 12938 corp: 3/197b lim: 120 exec/s: 0 rss: 68Mb L: 117/117 MS: 5 ChangeByte-ShuffleBytes-CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:50.456 [2024-11-18 19:07:09.035321] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.456 [2024-11-18 19:07:09.035351] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.715 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.715 #27 NEW cov: 10815 ft: 13644 corp: 4/276b lim: 120 exec/s: 0 rss: 69Mb L: 79/117 MS: 1 CrossOver- 00:08:50.715 [2024-11-18 19:07:09.221861] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.715 [2024-11-18 19:07:09.221892] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.975 #28 NEW cov: 10815 ft: 14999 corp: 5/395b lim: 120 exec/s: 28 rss: 69Mb L: 119/119 MS: 1 CopyPart- 
00:08:50.975 [2024-11-18 19:07:09.408312] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.975 [2024-11-18 19:07:09.408343] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.975 #29 NEW cov: 10815 ft: 15307 corp: 6/515b lim: 120 exec/s: 29 rss: 69Mb L: 120/120 MS: 1 CrossOver- 00:08:51.234 [2024-11-18 19:07:09.594825] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.234 [2024-11-18 19:07:09.594856] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.234 #30 NEW cov: 10815 ft: 15373 corp: 7/635b lim: 120 exec/s: 30 rss: 69Mb L: 120/120 MS: 1 ChangeByte- 00:08:51.234 [2024-11-18 19:07:09.782357] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.234 [2024-11-18 19:07:09.782386] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.493 #34 NEW cov: 10815 ft: 15622 corp: 8/681b lim: 120 exec/s: 34 rss: 69Mb L: 46/120 MS: 4 ChangeByte-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:51.493 [2024-11-18 19:07:09.965619] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.493 [2024-11-18 19:07:09.965664] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.493 #35 NEW cov: 10815 ft: 16315 corp: 9/800b lim: 120 exec/s: 35 rss: 69Mb L: 119/120 MS: 1 ShuffleBytes- 00:08:51.752 [2024-11-18 19:07:10.152162] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.752 [2024-11-18 19:07:10.152192] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.752 #36 NEW cov: 10822 ft: 16388 corp: 10/919b lim: 120 exec/s: 36 rss: 69Mb L: 119/120 MS: 1 PersAutoDict- DE: "+\337G\271\227\262\213\000"- 00:08:51.752 [2024-11-18 19:07:10.335661] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.752 [2024-11-18 19:07:10.335691] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.012 #37 NEW cov: 10822 ft: 16442 corp: 11/1039b lim: 120 exec/s: 18 rss: 69Mb L: 120/120 MS: 1 CMP- DE: "\004\000"- 00:08:52.012 #37 DONE cov: 10822 ft: 16442 corp: 11/1039b lim: 120 exec/s: 18 rss: 69Mb 00:08:52.012 ###### Recommended dictionary. ###### 00:08:52.012 "+\337G\271\227\262\213\000" # Uses: 2 00:08:52.012 "\004\000" # Uses: 0 00:08:52.012 ###### End of recommended dictionary. 
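Each target ends with a single `#N DONE cov: ... exec/s: ... rss: ...` summary like the one above, which is the quickest per-target health indicator in a nightly log. A small triage sketch, assuming the console output was captured to a file (console.log is a hypothetical name):

```sh
# Hypothetical helper: print the final coverage and execution rate
# for every fuzz target recorded in a saved console log.
grep -E '#[0-9]+ DONE ' console.log \
    | awk '{ for (i = 1; i <= NF; i++)
                 if ($i == "cov:" || $i == "exec/s:")
                     printf "%s %s  ", $i, $(i + 1);
             print "" }'
```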
###### 00:08:52.012 Done 37 runs in 2 second(s) 00:08:52.272 19:07:10 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:52.272 19:07:10 -- ../common.sh@72 -- # (( i++ )) 00:08:52.272 19:07:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.272 19:07:10 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:52.272 19:07:10 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:52.272 19:07:10 -- vfio/run.sh@23 -- # local timen=1 00:08:52.272 19:07:10 -- vfio/run.sh@24 -- # local core=0x1 00:08:52.272 19:07:10 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.272 19:07:10 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:52.272 19:07:10 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:52.272 19:07:10 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:52.272 19:07:10 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:52.272 19:07:10 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.272 19:07:10 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:52.272 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.272 19:07:10 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:52.272 [2024-11-18 19:07:10.755309] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:52.272 [2024-11-18 19:07:10.755386] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1313594 ] 00:08:52.272 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.272 [2024-11-18 19:07:10.828023] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.532 [2024-11-18 19:07:10.898229] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.532 [2024-11-18 19:07:10.898365] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.532 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.532 INFO: Seed: 2451482673 00:08:52.532 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:52.532 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:52.532 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.532 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.532 #2 INITED exec/s: 0 rss: 62Mb 00:08:52.532 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:52.532 This may also happen if the target rejected all inputs we tried so far 00:08:52.791 [2024-11-18 19:07:11.203727] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.791 [2024-11-18 19:07:11.203799] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.051 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:53.051 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:53.051 #13 NEW cov: 10776 ft: 10521 corp: 2/44b lim: 90 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:08:53.311 [2024-11-18 19:07:11.695875] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.311 [2024-11-18 19:07:11.695915] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.311 #14 NEW cov: 10790 ft: 14127 corp: 3/87b lim: 90 exec/s: 0 rss: 69Mb L: 43/43 MS: 1 ChangeBinInt- 00:08:53.311 [2024-11-18 19:07:11.900297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.311 [2024-11-18 19:07:11.900326] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.570 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.570 #15 NEW cov: 10807 ft: 15503 corp: 4/130b lim: 90 exec/s: 0 rss: 70Mb L: 43/43 MS: 1 ChangeByte- 00:08:53.570 [2024-11-18 19:07:12.099608] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.570 [2024-11-18 19:07:12.099638] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.830 #16 NEW cov: 10807 ft: 16248 corp: 5/177b lim: 90 exec/s: 16 rss: 70Mb L: 47/47 MS: 1 CMP- DE: "\000\000\000\004"- 00:08:53.830 [2024-11-18 19:07:12.297959] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.830 [2024-11-18 19:07:12.297989] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.830 #17 NEW cov: 10807 ft: 16453 corp: 6/224b lim: 90 exec/s: 17 rss: 70Mb L: 47/47 MS: 1 PersAutoDict- DE: "\000\000\000\004"- 00:08:54.089 [2024-11-18 19:07:12.499565] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.089 [2024-11-18 19:07:12.499610] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.089 #18 NEW cov: 10807 ft: 16800 corp: 7/271b lim: 90 exec/s: 18 rss: 70Mb L: 47/47 MS: 1 ShuffleBytes- 00:08:54.348 [2024-11-18 19:07:12.703724] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.348 [2024-11-18 19:07:12.703754] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.348 #19 NEW cov: 10807 ft: 16912 corp: 8/355b lim: 90 exec/s: 19 rss: 70Mb L: 84/84 MS: 1 CrossOver- 00:08:54.348 [2024-11-18 19:07:12.900825] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.348 [2024-11-18 19:07:12.900858] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.607 #20 NEW cov: 10814 ft: 17038 corp: 9/398b lim: 90 exec/s: 20 rss: 70Mb L: 43/84 MS: 1 ChangeByte- 00:08:54.607 [2024-11-18 
19:07:13.106682] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.607 [2024-11-18 19:07:13.106712] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.866 #21 NEW cov: 10814 ft: 17284 corp: 10/458b lim: 90 exec/s: 10 rss: 70Mb L: 60/84 MS: 1 CopyPart- 00:08:54.866 #21 DONE cov: 10814 ft: 17284 corp: 10/458b lim: 90 exec/s: 10 rss: 70Mb 00:08:54.866 ###### Recommended dictionary. ###### 00:08:54.866 "\000\000\000\004" # Uses: 1 00:08:54.866 ###### End of recommended dictionary. ###### 00:08:54.866 Done 21 runs in 2 second(s) 00:08:55.125 19:07:13 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:55.125 19:07:13 -- ../common.sh@72 -- # (( i++ )) 00:08:55.125 19:07:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.125 19:07:13 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:55.125 00:08:55.125 real 0m19.683s 00:08:55.125 user 0m27.579s 00:08:55.125 sys 0m1.887s 00:08:55.125 19:07:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:55.125 19:07:13 -- common/autotest_common.sh@10 -- # set +x 00:08:55.125 ************************************ 00:08:55.125 END TEST vfio_fuzz 00:08:55.125 ************************************ 00:08:55.125 00:08:55.125 real 1m24.928s 00:08:55.125 user 2m8.361s 00:08:55.125 sys 0m9.996s 00:08:55.125 19:07:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:55.125 19:07:13 -- common/autotest_common.sh@10 -- # set +x 00:08:55.125 ************************************ 00:08:55.125 END TEST llvm_fuzz 00:08:55.125 ************************************ 00:08:55.125 19:07:13 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:55.125 19:07:13 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:55.125 19:07:13 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:55.125 19:07:13 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:55.125 19:07:13 -- common/autotest_common.sh@10 -- # set +x 00:08:55.125 19:07:13 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:55.125 19:07:13 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:55.125 19:07:13 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:55.125 19:07:13 -- common/autotest_common.sh@10 -- # set +x 00:09:01.698 INFO: APP EXITING 00:09:01.698 INFO: killing all VMs 00:09:01.698 INFO: killing vhost app 00:09:01.698 INFO: EXIT DONE 00:09:03.718 Waiting for block devices as requested 00:09:03.718 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:03.718 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:03.718 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.009 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.009 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.009 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.009 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.270 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:04.270 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.270 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.528 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.528 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.528 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.787 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.787 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.787 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.047 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:08.338 Cleaning 00:09:08.338 Removing: /dev/shm/spdk_tgt_trace.pid1275462 00:09:08.338 
Removing: /var/run/dpdk/spdk_pid1272972 00:09:08.338 Removing: /var/run/dpdk/spdk_pid1274251 00:09:08.338 Removing: /var/run/dpdk/spdk_pid1275462 00:09:08.338 Removing: /var/run/dpdk/spdk_pid1276266 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1276593 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1276926 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1277273 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1277607 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1277900 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1278182 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1278500 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1279367 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1282569 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1282876 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1283191 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1283442 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1284008 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1284027 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1284600 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1284807 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1285124 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1285184 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1285474 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1285594 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1286120 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1286402 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1286675 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1286774 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1287078 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1287119 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1287409 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1287601 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1287813 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1288001 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1288275 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1288541 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1288830 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1289105 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1289388 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1289616 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1289819 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1289981 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1290252 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1290523 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1290804 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1291072 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1291359 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1291590 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1291796 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1291957 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1292221 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1292487 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1292780 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1293048 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1293334 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1293585 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1293788 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1293953 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1294202 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1294472 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1294754 00:09:08.598 Removing: /var/run/dpdk/spdk_pid1295022 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1295309 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1295569 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1295782 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1295962 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1296189 00:09:08.857 
Removing: /var/run/dpdk/spdk_pid1296451 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1296742 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1297014 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1297297 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1297370 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1297711 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1298467 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1298797 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1299302 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1299841 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1300285 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1300812 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1301687 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1302328 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1302658 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1303159 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1303700 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1304129 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1304534 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1305077 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1305482 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1305931 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1306467 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1306799 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1307300 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1307847 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1308142 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1308674 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1309161 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1309508 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1310057 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1310682 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1311201 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1311540 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1312070 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1312608 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1313152 00:09:08.857 Removing: /var/run/dpdk/spdk_pid1313594 00:09:08.857 Clean 00:09:09.117 killing process with pid 1224801 00:09:13.317 killing process with pid 1224798 00:09:13.317 killing process with pid 1224800 00:09:13.317 killing process with pid 1224799 00:09:13.317 19:07:31 -- common/autotest_common.sh@1446 -- # return 0 00:09:13.317 19:07:31 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:13.317 19:07:31 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.317 19:07:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.317 19:07:31 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:13.317 19:07:31 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.317 19:07:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.317 19:07:31 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:13.317 19:07:31 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:13.317 19:07:31 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:13.317 19:07:31 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:13.317 19:07:31 -- spdk/autotest.sh@383 -- # hostname 00:09:13.317 19:07:31 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:13.317 geninfo: WARNING: invalid characters removed from testname! 00:09:13.885 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:13.885 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:13.885 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:26.100 19:07:42 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:31.374 19:07:49 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:35.565 19:07:54 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:40.840 19:07:58 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:45.031 19:08:03 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:50.304 19:08:08 -- 
spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:54.498 19:08:12 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:54.498 19:08:12 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:54.498 19:08:12 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:54.498 19:08:12 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:54.498 19:08:12 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:54.498 19:08:12 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:54.498 19:08:12 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:54.499 19:08:12 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:54.499 19:08:12 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:54.499 19:08:12 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:54.499 19:08:12 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:54.499 19:08:12 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:54.499 19:08:12 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:54.499 19:08:12 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:54.499 19:08:12 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:54.499 19:08:12 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:54.499 19:08:12 -- scripts/common.sh@343 -- $ case "$op" in 00:09:54.499 19:08:12 -- scripts/common.sh@344 -- $ : 1 00:09:54.499 19:08:12 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:54.499 19:08:12 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:54.499 19:08:12 -- scripts/common.sh@364 -- $ decimal 1 00:09:54.499 19:08:12 -- scripts/common.sh@352 -- $ local d=1 00:09:54.499 19:08:12 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:54.499 19:08:12 -- scripts/common.sh@354 -- $ echo 1 00:09:54.499 19:08:12 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:54.499 19:08:12 -- scripts/common.sh@365 -- $ decimal 2 00:09:54.499 19:08:12 -- scripts/common.sh@352 -- $ local d=2 00:09:54.499 19:08:12 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:54.499 19:08:12 -- scripts/common.sh@354 -- $ echo 2 00:09:54.499 19:08:12 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:54.499 19:08:12 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:54.499 19:08:12 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:54.499 19:08:12 -- scripts/common.sh@367 -- $ return 0 00:09:54.499 19:08:12 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:54.499 19:08:12 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:54.499 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.499 --rc genhtml_branch_coverage=1 00:09:54.499 --rc genhtml_function_coverage=1 00:09:54.499 --rc genhtml_legend=1 00:09:54.499 --rc geninfo_all_blocks=1 00:09:54.499 --rc geninfo_unexecuted_blocks=1 00:09:54.499 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.499 ' 00:09:54.499 19:08:12 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:54.499 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.499 --rc genhtml_branch_coverage=1 00:09:54.499 --rc genhtml_function_coverage=1 00:09:54.499 --rc genhtml_legend=1 00:09:54.499 --rc geninfo_all_blocks=1 00:09:54.499 --rc geninfo_unexecuted_blocks=1 00:09:54.499 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.499 ' 00:09:54.499 19:08:12 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:54.499 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.499 --rc genhtml_branch_coverage=1 00:09:54.499 --rc genhtml_function_coverage=1 00:09:54.499 --rc genhtml_legend=1 00:09:54.499 --rc geninfo_all_blocks=1 00:09:54.499 --rc geninfo_unexecuted_blocks=1 00:09:54.499 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.499 ' 00:09:54.499 19:08:12 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:54.499 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.499 --rc genhtml_branch_coverage=1 00:09:54.499 --rc genhtml_function_coverage=1 00:09:54.499 --rc genhtml_legend=1 00:09:54.499 --rc geninfo_all_blocks=1 00:09:54.499 --rc geninfo_unexecuted_blocks=1 00:09:54.499 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.499 ' 00:09:54.499 19:08:12 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:54.499 19:08:12 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:54.499 19:08:12 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:54.499 19:08:12 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:54.499 19:08:12 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.499 19:08:12 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.499 19:08:12 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.499 19:08:12 -- paths/export.sh@5 -- $ export PATH 00:09:54.499 19:08:12 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.499 19:08:12 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:54.499 19:08:12 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:54.499 19:08:12 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731953292.XXXXXX 00:09:54.499 19:08:12 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731953292.GpigSD 00:09:54.499 19:08:12 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:09:54.499 19:08:12 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:09:54.499 19:08:12 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:09:54.499 19:08:12 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:54.499 19:08:12 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:54.499 19:08:12 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:54.499 19:08:12 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:54.499 19:08:12 -- common/autotest_common.sh@10 -- $ set +x 00:09:54.499 19:08:12 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:54.499 19:08:12 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:54.499 19:08:12 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
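The scripts/common.sh trace above (`lt 1.15 2` via cmp_versions) is the stock shell idiom for comparing dotted version strings: split both operands on `.`, `-` and `:` and compare them numerically component by component; here it returns 0, which is what switches on the pre-2.0 `--rc lcov_*` option block exported right after. An illustrative re-implementation assuming purely numeric components (the real script also normalizes fields through its decimal helper; the function name below is not the script's):

```sh
# Illustrative component-wise version compare, as seen in the xtrace above.
version_lt() {                        # succeeds (returns 0) when $1 < $2
    local IFS='.-:' i
    local -a v1 v2
    read -ra v1 <<< "$1"
    read -ra v2 <<< "$2"
    for (( i = 0; i < (${#v1[@]} > ${#v2[@]} ? ${#v1[@]} : ${#v2[@]}); i++ )); do
        (( ${v1[i]:-0} < ${v2[i]:-0} )) && return 0   # first lower field decides
        (( ${v1[i]:-0} > ${v2[i]:-0} )) && return 1
    done
    return 1                          # equal versions are not less-than
}
version_lt 1.15 2 && echo "lcov predates 2.x"         # 1 < 2, so this prints
```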
00:09:54.499 19:08:12 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:54.499 19:08:12 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:54.499 19:08:12 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:54.499 19:08:12 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:54.499 19:08:12 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:54.499 19:08:12 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:54.499 19:08:12 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:54.499 19:08:12 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:54.499 + [[ -n 1180866 ]] 00:09:54.499 + sudo kill 1180866 00:09:54.510 [Pipeline] } 00:09:54.526 [Pipeline] // stage 00:09:54.531 [Pipeline] } 00:09:54.546 [Pipeline] // timeout 00:09:54.552 [Pipeline] } 00:09:54.567 [Pipeline] // catchError 00:09:54.572 [Pipeline] } 00:09:54.588 [Pipeline] // wrap 00:09:54.595 [Pipeline] } 00:09:54.609 [Pipeline] // catchError 00:09:54.619 [Pipeline] stage 00:09:54.621 [Pipeline] { (Epilogue) 00:09:54.635 [Pipeline] catchError 00:09:54.637 [Pipeline] { 00:09:54.650 [Pipeline] echo 00:09:54.652 Cleanup processes 00:09:54.658 [Pipeline] sh 00:09:54.945 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:54.945 1322958 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:54.959 [Pipeline] sh 00:09:55.246 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.246 ++ grep -v 'sudo pgrep' 00:09:55.246 ++ awk '{print $1}' 00:09:55.246 + sudo kill -9 00:09:55.246 + true 00:09:55.258 [Pipeline] sh 00:09:55.545 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:55.545 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:55.545 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:56.544 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:06.534 [Pipeline] sh 00:10:06.821 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:06.822 Artifacts sizes are good 00:10:06.836 [Pipeline] archiveArtifacts 00:10:06.844 Archiving artifacts 00:10:06.970 [Pipeline] sh 00:10:07.255 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:07.269 [Pipeline] cleanWs 00:10:07.278 [WS-CLEANUP] Deleting project workspace... 00:10:07.278 [WS-CLEANUP] Deferred wipeout is used... 00:10:07.285 [WS-CLEANUP] done 00:10:07.287 [Pipeline] } 00:10:07.304 [Pipeline] // catchError 00:10:07.315 [Pipeline] sh 00:10:07.598 + logger -p user.info -t JENKINS-CI 00:10:07.607 [Pipeline] } 00:10:07.620 [Pipeline] // stage 00:10:07.624 [Pipeline] } 00:10:07.638 [Pipeline] // node 00:10:07.643 [Pipeline] End of Pipeline 00:10:07.680 Finished: SUCCESS
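For reference, the coverage post-processing traced in the epilogue reduces to one merge of the baseline and test captures followed by a chain of path filters. A condensed sketch with the repeated --rc/--gcov-tool wrapper options elided for readability (the trace also adds --ignore-errors unused for the /usr filter):

```sh
# Condensed view of the lcov aggregation steps from the epilogue above;
# every real invocation additionally carries the --rc/--gcov-tool options.
OUT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
lcov -q -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"
for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' \
           '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
    lcov -q -r "$OUT/cov_total.info" "$pat" -o "$OUT/cov_total.info"
done
```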