00:00:00.001 Started by upstream project "autotest-per-patch" build number 120541 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "jbp-per-patch" build number 21500 00:00:00.001 originally caused by: 00:00:00.002 Started by user sys_sgci 00:00:00.016 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.016 The recommended git tool is: git 00:00:00.017 using credential 00000000-0000-0000-0000-000000000002 00:00:00.018 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.029 Fetching changes from the remote Git repository 00:00:00.032 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.061 Using shallow fetch with depth 1 00:00:00.061 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.061 > git --version # timeout=10 00:00:00.073 > git --version # 'git version 2.39.2' 00:00:00.073 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.076 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.076 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/changes/39/22839/2 # timeout=5 00:00:02.990 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.000 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.012 Checking out Revision f7115024b58324eb1821d2923066970ea28490fc (FETCH_HEAD) 00:00:03.012 > git config core.sparsecheckout # timeout=10 00:00:03.022 > git read-tree -mu HEAD # timeout=10 00:00:03.038 > git checkout -f f7115024b58324eb1821d2923066970ea28490fc # timeout=5 00:00:03.053 Commit message: "jobs/autotest-upstream: Enable ASan, UBSan on all jobs" 00:00:03.054 > git rev-list --no-walk 77e645413453ce9660898a799e28995c970fadc7 # timeout=10 00:00:03.146 [Pipeline] Start of Pipeline 00:00:03.162 [Pipeline] library 00:00:03.163 Loading library shm_lib@master 00:00:03.163 Library shm_lib@master is cached. Copying from home. 00:00:03.181 [Pipeline] node 00:00:03.194 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.196 [Pipeline] { 00:00:03.208 [Pipeline] catchError 00:00:03.210 [Pipeline] { 00:00:03.222 [Pipeline] wrap 00:00:03.232 [Pipeline] { 00:00:03.243 [Pipeline] stage 00:00:03.245 [Pipeline] { (Prologue) 00:00:03.427 [Pipeline] sh 00:00:03.739 + logger -p user.info -t JENKINS-CI 00:00:03.754 [Pipeline] echo 00:00:03.756 Node: WFP39 00:00:03.762 [Pipeline] sh 00:00:04.058 [Pipeline] setCustomBuildProperty 00:00:04.071 [Pipeline] echo 00:00:04.072 Cleanup processes 00:00:04.076 [Pipeline] sh 00:00:04.358 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.358 227288 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.369 [Pipeline] sh 00:00:04.651 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.651 ++ grep -v 'sudo pgrep' 00:00:04.651 ++ awk '{print $1}' 00:00:04.651 + sudo kill -9 00:00:04.651 + true 00:00:04.664 [Pipeline] cleanWs 00:00:04.672 [WS-CLEANUP] Deleting project workspace... 00:00:04.672 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.678 [WS-CLEANUP] done 00:00:04.683 [Pipeline] setCustomBuildProperty 00:00:04.700 [Pipeline] sh 00:00:04.983 + sudo git config --global --replace-all safe.directory '*' 00:00:05.053 [Pipeline] nodesByLabel 00:00:05.054 Found a total of 1 nodes with the 'sorcerer' label 00:00:05.062 [Pipeline] httpRequest 00:00:05.066 HttpMethod: GET 00:00:05.067 URL: http://10.211.164.101/packages/jbp_f7115024b58324eb1821d2923066970ea28490fc.tar.gz 00:00:05.070 Sending request to url: http://10.211.164.101/packages/jbp_f7115024b58324eb1821d2923066970ea28490fc.tar.gz 00:00:05.075 Response Code: HTTP/1.1 200 OK 00:00:05.076 Success: Status code 200 is in the accepted range: 200,404 00:00:05.076 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_f7115024b58324eb1821d2923066970ea28490fc.tar.gz 00:00:05.819 [Pipeline] sh 00:00:06.097 + tar --no-same-owner -xf jbp_f7115024b58324eb1821d2923066970ea28490fc.tar.gz 00:00:06.115 [Pipeline] httpRequest 00:00:06.120 HttpMethod: GET 00:00:06.120 URL: http://10.211.164.101/packages/spdk_65b4e17c6736ae69784017a5d5557443b6997899.tar.gz 00:00:06.121 Sending request to url: http://10.211.164.101/packages/spdk_65b4e17c6736ae69784017a5d5557443b6997899.tar.gz 00:00:06.123 Response Code: HTTP/1.1 200 OK 00:00:06.124 Success: Status code 200 is in the accepted range: 200,404 00:00:06.124 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_65b4e17c6736ae69784017a5d5557443b6997899.tar.gz 00:00:21.747 [Pipeline] sh 00:00:22.032 + tar --no-same-owner -xf spdk_65b4e17c6736ae69784017a5d5557443b6997899.tar.gz 00:00:24.580 [Pipeline] sh 00:00:24.862 + git -C spdk log --oneline -n5 00:00:24.862 65b4e17c6 uuid: clarify spdk_uuid_generate_sha1() return code 00:00:24.862 5d5e4d333 nvmf/rpc: Fail listener add with different secure channel 00:00:24.862 54944c1d1 event: don't NOTICELOG when no RPC server started 00:00:24.862 460a2e391 lib/init: do not fail if missing RPC's subsystem in JSON config doesn't exist in app 00:00:24.862 5dc808124 init: add spdk_subsystem_exists() 00:00:24.874 [Pipeline] } 00:00:24.891 [Pipeline] // stage 00:00:24.898 [Pipeline] stage 00:00:24.900 [Pipeline] { (Prepare) 00:00:24.919 [Pipeline] writeFile 00:00:24.935 [Pipeline] sh 00:00:25.217 + logger -p user.info -t JENKINS-CI 00:00:25.231 [Pipeline] sh 00:00:25.520 + logger -p user.info -t JENKINS-CI 00:00:25.534 [Pipeline] sh 00:00:25.822 + cat autorun-spdk.conf 00:00:25.822 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:25.822 SPDK_TEST_FUZZER_SHORT=1 00:00:25.822 SPDK_TEST_FUZZER=1 00:00:25.822 SPDK_RUN_ASAN=1 00:00:25.822 SPDK_RUN_UBSAN=1 00:00:25.830 RUN_NIGHTLY=0 00:00:25.835 [Pipeline] readFile 00:00:25.862 [Pipeline] withEnv 00:00:25.865 [Pipeline] { 00:00:25.879 [Pipeline] sh 00:00:26.164 + set -ex 00:00:26.164 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:26.164 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:26.164 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:26.164 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:26.164 ++ SPDK_TEST_FUZZER=1 00:00:26.164 ++ SPDK_RUN_ASAN=1 00:00:26.164 ++ SPDK_RUN_UBSAN=1 00:00:26.164 ++ RUN_NIGHTLY=0 00:00:26.164 + case $SPDK_TEST_NVMF_NICS in 00:00:26.164 + DRIVERS= 00:00:26.164 + [[ -n '' ]] 00:00:26.164 + exit 0 00:00:26.174 [Pipeline] } 00:00:26.195 [Pipeline] // withEnv 00:00:26.201 [Pipeline] } 00:00:26.217 [Pipeline] // stage 00:00:26.227 [Pipeline] catchError 00:00:26.228 [Pipeline] { 00:00:26.242 [Pipeline] timeout 00:00:26.243 Timeout set to expire in 30 min 00:00:26.244 
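A note on the autorun-spdk.conf handling traced earlier in this stage: the pipeline first writes the conf, then the test scripts source it behind an existence check, so any flag the conf omits simply stays unset. Below is a minimal standalone sketch of that idiom, not the actual jbp script; the conf path and SPDK_* variable names are taken from the log, while the default-fallback line is an assumption added for illustration:

  #!/usr/bin/env bash
  # Sketch of the guarded conf-sourcing idiom seen in the trace above.
  set -ex
  conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
  if [[ -f $conf ]]; then
      source "$conf"
  fi
  # Flags absent from the conf fall back to a default (assumed behavior):
  : "${SPDK_TEST_FUZZER:=0}"
  if [[ $SPDK_TEST_FUZZER -eq 1 ]]; then
      echo "fuzzer tests enabled"
  fi

The same defaulting is what lets the 'case $SPDK_TEST_NVMF_NICS in' branch in the trace above fall through harmlessly when the conf never sets that variable.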
[Pipeline] {
00:00:26.256 [Pipeline] stage
00:00:26.257 [Pipeline] { (Tests)
00:00:26.270 [Pipeline] sh
00:00:26.551 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:26.551 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:26.551 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:00:26.551 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:00:26.551 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:26.551 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:26.551 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:00:26.551 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:26.551 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:26.551 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:26.551 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:26.551 + source /etc/os-release
00:00:26.551 ++ NAME='Fedora Linux'
00:00:26.551 ++ VERSION='38 (Cloud Edition)'
00:00:26.551 ++ ID=fedora
00:00:26.551 ++ VERSION_ID=38
00:00:26.551 ++ VERSION_CODENAME=
00:00:26.551 ++ PLATFORM_ID=platform:f38
00:00:26.551 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:26.551 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:26.551 ++ LOGO=fedora-logo-icon
00:00:26.551 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:26.551 ++ HOME_URL=https://fedoraproject.org/
00:00:26.551 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:26.551 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:26.551 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:26.551 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:26.551 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:26.551 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:26.551 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:26.551 ++ SUPPORT_END=2024-05-14
00:00:26.551 ++ VARIANT='Cloud Edition'
00:00:26.551 ++ VARIANT_ID=cloud
00:00:26.551 + uname -a
00:00:26.551 Linux spdk-wfp-39 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:00:26.551 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:00:29.845 Hugepages
00:00:29.845 node hugesize free / total
00:00:29.845 node0 1048576kB 0 / 0
00:00:29.845 node0 2048kB 0 / 0
00:00:29.845 node1 1048576kB 0 / 0
00:00:29.845 node1 2048kB 0 / 0
00:00:29.845
00:00:29.845 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:29.845 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:29.845 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:29.845 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:00:29.845 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:29.845 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:29.845 + rm -f /tmp/spdk-ld-path
00:00:29.845 + source
autorun-spdk.conf 00:00:29.845 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:29.845 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:29.845 ++ SPDK_TEST_FUZZER=1 00:00:29.845 ++ SPDK_RUN_ASAN=1 00:00:29.845 ++ SPDK_RUN_UBSAN=1 00:00:29.845 ++ RUN_NIGHTLY=0 00:00:29.845 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:29.845 + [[ -n '' ]] 00:00:29.845 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:29.845 + for M in /var/spdk/build-*-manifest.txt 00:00:29.845 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:29.845 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:29.845 + for M in /var/spdk/build-*-manifest.txt 00:00:29.845 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:29.845 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:29.845 ++ uname 00:00:29.845 + [[ Linux == \L\i\n\u\x ]] 00:00:29.845 + sudo dmesg -T 00:00:29.845 + sudo dmesg --clear 00:00:29.845 + dmesg_pid=228835 00:00:29.845 + [[ Fedora Linux == FreeBSD ]] 00:00:29.845 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:29.845 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:29.845 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:29.845 + [[ -x /usr/src/fio-static/fio ]] 00:00:29.845 + sudo dmesg -Tw 00:00:29.845 + export FIO_BIN=/usr/src/fio-static/fio 00:00:29.845 + FIO_BIN=/usr/src/fio-static/fio 00:00:29.845 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:29.845 + [[ ! -v VFIO_QEMU_BIN ]] 00:00:29.845 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:29.845 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:29.845 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:29.845 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:29.845 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:29.845 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:29.845 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:29.845 Test configuration: 00:00:29.845 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:29.845 SPDK_TEST_FUZZER_SHORT=1 00:00:29.845 SPDK_TEST_FUZZER=1 00:00:29.845 SPDK_RUN_ASAN=1 00:00:29.845 SPDK_RUN_UBSAN=1 00:00:29.845 RUN_NIGHTLY=0 11:36:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:29.845 11:36:20 -- scripts/common.sh@502 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:29.845 11:36:20 -- scripts/common.sh@510 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:29.845 11:36:20 -- scripts/common.sh@511 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:29.845 11:36:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:29.845 11:36:20 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:29.845 11:36:20 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:29.845 11:36:20 -- paths/export.sh@5 -- $ export PATH 00:00:29.845 11:36:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:29.845 11:36:20 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:29.845 11:36:20 -- common/autobuild_common.sh@435 -- $ date +%s 00:00:29.845 11:36:20 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713432980.XXXXXX 00:00:29.845 11:36:20 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713432980.XrHsPX 00:00:29.845 11:36:20 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:00:29.845 11:36:20 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']' 00:00:29.845 11:36:20 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:00:29.845 11:36:20 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:29.845 11:36:20 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:29.845 11:36:20 -- common/autobuild_common.sh@451 -- $ get_config_params 00:00:29.845 11:36:20 -- common/autotest_common.sh@385 -- $ xtrace_disable 00:00:29.845 11:36:20 -- common/autotest_common.sh@10 -- $ set +x 00:00:29.845 11:36:20 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-vfio-user' 00:00:29.845 11:36:20 -- common/autobuild_common.sh@453 -- $ start_monitor_resources 00:00:29.845 11:36:20 -- pm/common@17 -- $ local monitor 00:00:29.845 11:36:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:29.845 11:36:20 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=228872 00:00:29.845 11:36:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:29.845 11:36:20 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=228874 00:00:29.845 11:36:20 -- pm/common@21 -- $ date +%s 00:00:29.845 11:36:20 -- 
pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:29.845 11:36:20 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=228876 00:00:29.845 11:36:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:29.845 11:36:20 -- pm/common@21 -- $ date +%s 00:00:29.845 11:36:20 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=228879 00:00:29.845 11:36:20 -- pm/common@26 -- $ sleep 1 00:00:29.845 11:36:20 -- pm/common@21 -- $ date +%s 00:00:29.845 11:36:20 -- pm/common@21 -- $ date +%s 00:00:29.845 11:36:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713432980 00:00:29.845 11:36:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713432980 00:00:29.845 11:36:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713432980 00:00:29.845 11:36:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713432980 00:00:29.845 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713432980_collect-vmstat.pm.log 00:00:29.845 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713432980_collect-cpu-load.pm.log 00:00:29.845 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713432980_collect-cpu-temp.pm.log 00:00:29.845 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713432980_collect-bmc-pm.bmc.pm.log 00:00:30.786 11:36:21 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT 00:00:30.786 11:36:21 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:30.786 11:36:21 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:30.786 11:36:21 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:30.786 11:36:21 -- spdk/autobuild.sh@16 -- $ date -u 00:00:30.786 Thu Apr 18 09:36:21 AM UTC 2024 00:00:30.786 11:36:21 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:30.786 v24.05-pre-407-g65b4e17c6 00:00:30.786 11:36:21 -- spdk/autobuild.sh@19 -- $ '[' 1 -eq 1 ']' 00:00:30.786 11:36:21 -- spdk/autobuild.sh@20 -- $ run_test asan echo 'using asan' 00:00:30.786 11:36:21 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:00:30.786 11:36:21 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:30.786 11:36:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:31.045 ************************************ 00:00:31.045 START TEST asan 00:00:31.046 ************************************ 00:00:31.046 11:36:21 -- common/autotest_common.sh@1111 -- $ echo 'using asan' 00:00:31.046 using asan 00:00:31.046 00:00:31.046 real 0m0.001s 00:00:31.046 user 0m0.001s 00:00:31.046 sys 0m0.000s 00:00:31.046 11:36:21 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:00:31.046 11:36:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:31.046 ************************************ 00:00:31.046 END 
TEST asan 00:00:31.046 ************************************ 00:00:31.046 11:36:21 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:31.046 11:36:21 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:31.046 11:36:21 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:00:31.046 11:36:21 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:31.046 11:36:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:31.305 ************************************ 00:00:31.305 START TEST ubsan 00:00:31.305 ************************************ 00:00:31.305 11:36:21 -- common/autotest_common.sh@1111 -- $ echo 'using ubsan' 00:00:31.305 using ubsan 00:00:31.305 00:00:31.305 real 0m0.000s 00:00:31.305 user 0m0.000s 00:00:31.305 sys 0m0.000s 00:00:31.305 11:36:21 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:00:31.305 11:36:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:31.305 ************************************ 00:00:31.305 END TEST ubsan 00:00:31.305 ************************************ 00:00:31.305 11:36:21 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:31.305 11:36:21 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:31.305 11:36:21 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:31.305 11:36:21 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:00:31.305 11:36:21 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:00:31.305 11:36:21 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:00:31.305 11:36:21 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:00:31.305 11:36:21 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:31.305 11:36:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:31.305 ************************************ 00:00:31.305 START TEST autobuild_llvm_precompile 00:00:31.305 ************************************ 00:00:31.305 11:36:21 -- common/autotest_common.sh@1111 -- $ _llvm_precompile 00:00:31.305 11:36:21 -- common/autobuild_common.sh@32 -- $ clang --version 00:00:31.305 11:36:21 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:00:31.305 Target: x86_64-redhat-linux-gnu 00:00:31.305 Thread model: posix 00:00:31.305 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:00:31.305 11:36:21 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:00:31.305 11:36:21 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:00:31.305 11:36:21 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:00:31.305 11:36:21 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:00:31.305 11:36:21 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:00:31.305 11:36:21 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:00:31.565 11:36:21 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:00:31.565 11:36:21 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:00:31.565 11:36:21 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:00:31.565 11:36:21 -- common/autobuild_common.sh@44 -- $ 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:00:31.824 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:00:31.824 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:00:32.084 Using 'verbs' RDMA provider 00:00:48.357 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:00.578 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:00.578 Creating mk/config.mk...done. 00:01:00.578 Creating mk/cc.flags.mk...done. 00:01:00.578 Type 'make' to build. 00:01:00.578 00:01:00.578 real 0m29.293s 00:01:00.578 user 0m12.821s 00:01:00.578 sys 0m15.683s 00:01:00.578 11:36:51 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:01:00.578 11:36:51 -- common/autotest_common.sh@10 -- $ set +x 00:01:00.578 ************************************ 00:01:00.578 END TEST autobuild_llvm_precompile 00:01:00.578 ************************************ 00:01:00.838 11:36:51 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:00.838 11:36:51 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:00.838 11:36:51 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:00.838 11:36:51 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:00.838 11:36:51 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:01:01.097 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:01.097 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:01.356 Using 'verbs' RDMA provider 00:01:14.577 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:26.797 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:26.797 Creating mk/config.mk...done. 00:01:26.797 Creating mk/cc.flags.mk...done. 00:01:26.797 Type 'make' to build. 00:01:26.797 11:37:15 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:26.797 11:37:15 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:26.797 11:37:15 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:26.797 11:37:15 -- common/autotest_common.sh@10 -- $ set +x 00:01:26.797 ************************************ 00:01:26.797 START TEST make 00:01:26.797 ************************************ 00:01:26.797 11:37:15 -- common/autotest_common.sh@1111 -- $ make -j72 00:01:26.797 make[1]: Nothing to be done for 'all'. 
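Before the configure invocation above, autobuild_llvm_precompile resolves clang's libFuzzer archive with a bash extended-glob pattern and hands it to configure as --with-fuzzer. Here is a self-contained sketch of that lookup; the path pattern mirrors the log, while the nullglob guard and error message are assumptions added for robustness:

  #!/usr/bin/env bash
  # Locate libclang_rt.fuzzer_no_main for the detected clang major version.
  # The ?(-x86_64) alternative only expands with extglob enabled, which is
  # why the sketch sets that shell option explicitly.
  shopt -s extglob nullglob
  clang_num=16    # parsed from 'clang --version' in the log above
  fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
  # nullglob leaves the array empty when nothing matches (assumed handling):
  (( ${#fuzzer_libs[@]} )) || { echo "no fuzzer_no_main library found" >&2; exit 1; }
  echo "using ${fuzzer_libs[0]}"
  # On this host the match is:
  # /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a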
00:01:27.366 The Meson build system
00:01:27.366 Version: 1.3.1
00:01:27.366 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:01:27.366 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:27.366 Build type: native build
00:01:27.366 Project name: libvfio-user
00:01:27.366 Project version: 0.0.1
00:01:27.366 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:01:27.366 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:01:27.366 Host machine cpu family: x86_64
00:01:27.366 Host machine cpu: x86_64
00:01:27.366 Run-time dependency threads found: YES
00:01:27.366 Library dl found: YES
00:01:27.366 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:27.366 Run-time dependency json-c found: YES 0.17
00:01:27.366 Run-time dependency cmocka found: YES 1.1.7
00:01:27.366 Program pytest-3 found: NO
00:01:27.366 Program flake8 found: NO
00:01:27.366 Program misspell-fixer found: NO
00:01:27.366 Program restructuredtext-lint found: NO
00:01:27.366 Program valgrind found: YES (/usr/bin/valgrind)
00:01:27.366 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:27.366 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:27.366 Compiler for C supports arguments -Wwrite-strings: YES
00:01:27.366 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:27.366 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:01:27.366 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:01:27.366 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:27.366 Build targets in project: 8 00:01:27.366 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:01:27.366 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:01:27.366 00:01:27.366 libvfio-user 0.0.1 00:01:27.366 00:01:27.366 User defined options 00:01:27.366 buildtype : debug 00:01:27.366 default_library: static 00:01:27.366 libdir : /usr/local/lib 00:01:27.366 00:01:27.366 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:27.624 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:27.624 [1/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:01:27.624 [2/36] Compiling C object samples/null.p/null.c.o 00:01:27.624 [3/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:01:27.624 [4/36] Compiling C object samples/lspci.p/lspci.c.o 00:01:27.624 [5/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:01:27.624 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:01:27.624 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:01:27.624 [8/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:01:27.624 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:01:27.624 [10/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:01:27.624 [11/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:01:27.624 [12/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:01:27.624 [13/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:01:27.624 [14/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:01:27.624 [15/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:01:27.624 [16/36] Compiling C object test/unit_tests.p/mocks.c.o 00:01:27.624 [17/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:01:27.624 [18/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:01:27.624 [19/36] Compiling C object samples/server.p/server.c.o 00:01:27.624 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:01:27.625 [21/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:01:27.625 [22/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:01:27.625 [23/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:01:27.625 [24/36] Compiling C object samples/client.p/client.c.o 00:01:27.625 [25/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:01:27.625 [26/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:01:27.625 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:01:27.625 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:01:27.883 [29/36] Linking static target lib/libvfio-user.a 00:01:27.883 [30/36] Linking target samples/client 00:01:27.883 [31/36] Linking target samples/shadow_ioeventfd_server 00:01:27.883 [32/36] Linking target samples/null 00:01:27.883 [33/36] Linking target samples/gpio-pci-idio-16 00:01:27.883 [34/36] Linking target test/unit_tests 00:01:27.883 [35/36] Linking target samples/lspci 00:01:27.883 [36/36] Linking target samples/server 00:01:27.883 INFO: autodetecting backend as ninja 00:01:27.883 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:27.883 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:28.141 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:28.141 ninja: no work to do. 00:01:34.693 The Meson build system 00:01:34.693 Version: 1.3.1 00:01:34.693 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:01:34.693 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:01:34.693 Build type: native build 00:01:34.693 Program cat found: YES (/usr/bin/cat) 00:01:34.693 Project name: DPDK 00:01:34.693 Project version: 23.11.0 00:01:34.693 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:01:34.693 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:01:34.693 Host machine cpu family: x86_64 00:01:34.693 Host machine cpu: x86_64 00:01:34.693 Message: ## Building in Developer Mode ## 00:01:34.693 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:34.693 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:34.693 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:34.693 Program python3 found: YES (/usr/bin/python3) 00:01:34.693 Program cat found: YES (/usr/bin/cat) 00:01:34.693 Compiler for C supports arguments -march=native: YES 00:01:34.693 Checking for size of "void *" : 8 00:01:34.693 Checking for size of "void *" : 8 (cached) 00:01:34.693 Library m found: YES 00:01:34.693 Library numa found: YES 00:01:34.693 Has header "numaif.h" : YES 00:01:34.693 Library fdt found: NO 00:01:34.693 Library execinfo found: NO 00:01:34.693 Has header "execinfo.h" : YES 00:01:34.693 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:34.693 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:34.693 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:34.693 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:34.693 Run-time dependency openssl found: YES 3.0.9 00:01:34.693 Run-time dependency libpcap found: YES 1.10.4 00:01:34.693 Has header "pcap.h" with dependency libpcap: YES 00:01:34.693 Compiler for C supports arguments -Wcast-qual: YES 00:01:34.693 Compiler for C supports arguments -Wdeprecated: YES 00:01:34.693 Compiler for C supports arguments -Wformat: YES 00:01:34.693 Compiler for C supports arguments -Wformat-nonliteral: YES 00:01:34.693 Compiler for C supports arguments -Wformat-security: YES 00:01:34.693 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:34.693 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:34.693 Compiler for C supports arguments -Wnested-externs: YES 00:01:34.693 Compiler for C supports arguments -Wold-style-definition: YES 00:01:34.693 Compiler for C supports arguments -Wpointer-arith: YES 00:01:34.693 Compiler for C supports arguments -Wsign-compare: YES 00:01:34.693 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:34.693 Compiler for C supports arguments -Wundef: YES 00:01:34.693 Compiler for C supports arguments -Wwrite-strings: YES 00:01:34.693 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:34.693 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:01:34.693 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:34.693 Program objdump found: YES (/usr/bin/objdump) 00:01:34.693 
Compiler for C supports arguments -mavx512f: YES 00:01:34.693 Checking if "AVX512 checking" compiles: YES 00:01:34.693 Fetching value of define "__SSE4_2__" : 1 00:01:34.693 Fetching value of define "__AES__" : 1 00:01:34.693 Fetching value of define "__AVX__" : 1 00:01:34.693 Fetching value of define "__AVX2__" : 1 00:01:34.693 Fetching value of define "__AVX512BW__" : 1 00:01:34.693 Fetching value of define "__AVX512CD__" : 1 00:01:34.693 Fetching value of define "__AVX512DQ__" : 1 00:01:34.693 Fetching value of define "__AVX512F__" : 1 00:01:34.693 Fetching value of define "__AVX512VL__" : 1 00:01:34.693 Fetching value of define "__PCLMUL__" : 1 00:01:34.693 Fetching value of define "__RDRND__" : 1 00:01:34.693 Fetching value of define "__RDSEED__" : 1 00:01:34.693 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:34.693 Fetching value of define "__znver1__" : (undefined) 00:01:34.693 Fetching value of define "__znver2__" : (undefined) 00:01:34.693 Fetching value of define "__znver3__" : (undefined) 00:01:34.693 Fetching value of define "__znver4__" : (undefined) 00:01:34.693 Compiler for C supports arguments -Wno-format-truncation: NO 00:01:34.693 Message: lib/log: Defining dependency "log" 00:01:34.693 Message: lib/kvargs: Defining dependency "kvargs" 00:01:34.693 Message: lib/telemetry: Defining dependency "telemetry" 00:01:34.693 Library rt found: YES 00:01:34.693 Checking for function "getentropy" : NO 00:01:34.693 Message: lib/eal: Defining dependency "eal" 00:01:34.693 Message: lib/ring: Defining dependency "ring" 00:01:34.693 Message: lib/rcu: Defining dependency "rcu" 00:01:34.693 Message: lib/mempool: Defining dependency "mempool" 00:01:34.693 Message: lib/mbuf: Defining dependency "mbuf" 00:01:34.693 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:34.693 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:34.693 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:34.693 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:34.693 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:34.693 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:34.693 Compiler for C supports arguments -mpclmul: YES 00:01:34.693 Compiler for C supports arguments -maes: YES 00:01:34.693 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:34.693 Compiler for C supports arguments -mavx512bw: YES 00:01:34.693 Compiler for C supports arguments -mavx512dq: YES 00:01:34.693 Compiler for C supports arguments -mavx512vl: YES 00:01:34.693 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:34.693 Compiler for C supports arguments -mavx2: YES 00:01:34.693 Compiler for C supports arguments -mavx: YES 00:01:34.693 Message: lib/net: Defining dependency "net" 00:01:34.693 Message: lib/meter: Defining dependency "meter" 00:01:34.693 Message: lib/ethdev: Defining dependency "ethdev" 00:01:34.693 Message: lib/pci: Defining dependency "pci" 00:01:34.693 Message: lib/cmdline: Defining dependency "cmdline" 00:01:34.693 Message: lib/hash: Defining dependency "hash" 00:01:34.693 Message: lib/timer: Defining dependency "timer" 00:01:34.693 Message: lib/compressdev: Defining dependency "compressdev" 00:01:34.693 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:34.693 Message: lib/dmadev: Defining dependency "dmadev" 00:01:34.693 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:34.693 Message: lib/power: Defining dependency "power" 00:01:34.693 Message: lib/reorder: Defining dependency "reorder" 00:01:34.693 Message: 
lib/security: Defining dependency "security" 00:01:34.693 Has header "linux/userfaultfd.h" : YES 00:01:34.693 Has header "linux/vduse.h" : YES 00:01:34.693 Message: lib/vhost: Defining dependency "vhost" 00:01:34.693 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:01:34.693 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:34.693 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:34.693 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:34.693 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:34.693 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:34.693 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:34.693 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:34.693 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:34.693 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:34.693 Program doxygen found: YES (/usr/bin/doxygen) 00:01:34.693 Configuring doxy-api-html.conf using configuration 00:01:34.693 Configuring doxy-api-man.conf using configuration 00:01:34.693 Program mandb found: YES (/usr/bin/mandb) 00:01:34.693 Program sphinx-build found: NO 00:01:34.693 Configuring rte_build_config.h using configuration 00:01:34.693 Message: 00:01:34.693 ================= 00:01:34.693 Applications Enabled 00:01:34.693 ================= 00:01:34.693 00:01:34.693 apps: 00:01:34.693 00:01:34.693 00:01:34.693 Message: 00:01:34.693 ================= 00:01:34.693 Libraries Enabled 00:01:34.693 ================= 00:01:34.693 00:01:34.693 libs: 00:01:34.693 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:34.693 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:34.693 cryptodev, dmadev, power, reorder, security, vhost, 00:01:34.693 00:01:34.693 Message: 00:01:34.693 =============== 00:01:34.693 Drivers Enabled 00:01:34.693 =============== 00:01:34.693 00:01:34.693 common: 00:01:34.693 00:01:34.693 bus: 00:01:34.693 pci, vdev, 00:01:34.693 mempool: 00:01:34.693 ring, 00:01:34.693 dma: 00:01:34.693 00:01:34.693 net: 00:01:34.693 00:01:34.693 crypto: 00:01:34.693 00:01:34.693 compress: 00:01:34.693 00:01:34.693 vdpa: 00:01:34.693 00:01:34.693 00:01:34.693 Message: 00:01:34.693 ================= 00:01:34.693 Content Skipped 00:01:34.693 ================= 00:01:34.693 00:01:34.693 apps: 00:01:34.693 dumpcap: explicitly disabled via build config 00:01:34.693 graph: explicitly disabled via build config 00:01:34.693 pdump: explicitly disabled via build config 00:01:34.693 proc-info: explicitly disabled via build config 00:01:34.693 test-acl: explicitly disabled via build config 00:01:34.693 test-bbdev: explicitly disabled via build config 00:01:34.693 test-cmdline: explicitly disabled via build config 00:01:34.693 test-compress-perf: explicitly disabled via build config 00:01:34.693 test-crypto-perf: explicitly disabled via build config 00:01:34.693 test-dma-perf: explicitly disabled via build config 00:01:34.693 test-eventdev: explicitly disabled via build config 00:01:34.693 test-fib: explicitly disabled via build config 00:01:34.693 test-flow-perf: explicitly disabled via build config 00:01:34.693 test-gpudev: explicitly disabled via build config 00:01:34.693 test-mldev: explicitly disabled via build config 00:01:34.693 test-pipeline: explicitly disabled via build config 00:01:34.693 test-pmd: explicitly disabled via build config 00:01:34.693 
test-regex: explicitly disabled via build config 00:01:34.693 test-sad: explicitly disabled via build config 00:01:34.693 test-security-perf: explicitly disabled via build config 00:01:34.693 00:01:34.693 libs: 00:01:34.694 metrics: explicitly disabled via build config 00:01:34.694 acl: explicitly disabled via build config 00:01:34.694 bbdev: explicitly disabled via build config 00:01:34.694 bitratestats: explicitly disabled via build config 00:01:34.694 bpf: explicitly disabled via build config 00:01:34.694 cfgfile: explicitly disabled via build config 00:01:34.694 distributor: explicitly disabled via build config 00:01:34.694 efd: explicitly disabled via build config 00:01:34.694 eventdev: explicitly disabled via build config 00:01:34.694 dispatcher: explicitly disabled via build config 00:01:34.694 gpudev: explicitly disabled via build config 00:01:34.694 gro: explicitly disabled via build config 00:01:34.694 gso: explicitly disabled via build config 00:01:34.694 ip_frag: explicitly disabled via build config 00:01:34.694 jobstats: explicitly disabled via build config 00:01:34.694 latencystats: explicitly disabled via build config 00:01:34.694 lpm: explicitly disabled via build config 00:01:34.694 member: explicitly disabled via build config 00:01:34.694 pcapng: explicitly disabled via build config 00:01:34.694 rawdev: explicitly disabled via build config 00:01:34.694 regexdev: explicitly disabled via build config 00:01:34.694 mldev: explicitly disabled via build config 00:01:34.694 rib: explicitly disabled via build config 00:01:34.694 sched: explicitly disabled via build config 00:01:34.694 stack: explicitly disabled via build config 00:01:34.694 ipsec: explicitly disabled via build config 00:01:34.694 pdcp: explicitly disabled via build config 00:01:34.694 fib: explicitly disabled via build config 00:01:34.694 port: explicitly disabled via build config 00:01:34.694 pdump: explicitly disabled via build config 00:01:34.694 table: explicitly disabled via build config 00:01:34.694 pipeline: explicitly disabled via build config 00:01:34.694 graph: explicitly disabled via build config 00:01:34.694 node: explicitly disabled via build config 00:01:34.694 00:01:34.694 drivers: 00:01:34.694 common/cpt: not in enabled drivers build config 00:01:34.694 common/dpaax: not in enabled drivers build config 00:01:34.694 common/iavf: not in enabled drivers build config 00:01:34.694 common/idpf: not in enabled drivers build config 00:01:34.694 common/mvep: not in enabled drivers build config 00:01:34.694 common/octeontx: not in enabled drivers build config 00:01:34.694 bus/auxiliary: not in enabled drivers build config 00:01:34.694 bus/cdx: not in enabled drivers build config 00:01:34.694 bus/dpaa: not in enabled drivers build config 00:01:34.694 bus/fslmc: not in enabled drivers build config 00:01:34.694 bus/ifpga: not in enabled drivers build config 00:01:34.694 bus/platform: not in enabled drivers build config 00:01:34.694 bus/vmbus: not in enabled drivers build config 00:01:34.694 common/cnxk: not in enabled drivers build config 00:01:34.694 common/mlx5: not in enabled drivers build config 00:01:34.694 common/nfp: not in enabled drivers build config 00:01:34.694 common/qat: not in enabled drivers build config 00:01:34.694 common/sfc_efx: not in enabled drivers build config 00:01:34.694 mempool/bucket: not in enabled drivers build config 00:01:34.694 mempool/cnxk: not in enabled drivers build config 00:01:34.694 mempool/dpaa: not in enabled drivers build config 00:01:34.694 mempool/dpaa2: not in 
enabled drivers build config 00:01:34.694 mempool/octeontx: not in enabled drivers build config 00:01:34.694 mempool/stack: not in enabled drivers build config 00:01:34.694 dma/cnxk: not in enabled drivers build config 00:01:34.694 dma/dpaa: not in enabled drivers build config 00:01:34.694 dma/dpaa2: not in enabled drivers build config 00:01:34.694 dma/hisilicon: not in enabled drivers build config 00:01:34.694 dma/idxd: not in enabled drivers build config 00:01:34.694 dma/ioat: not in enabled drivers build config 00:01:34.694 dma/skeleton: not in enabled drivers build config 00:01:34.694 net/af_packet: not in enabled drivers build config 00:01:34.694 net/af_xdp: not in enabled drivers build config 00:01:34.694 net/ark: not in enabled drivers build config 00:01:34.694 net/atlantic: not in enabled drivers build config 00:01:34.694 net/avp: not in enabled drivers build config 00:01:34.694 net/axgbe: not in enabled drivers build config 00:01:34.694 net/bnx2x: not in enabled drivers build config 00:01:34.694 net/bnxt: not in enabled drivers build config 00:01:34.694 net/bonding: not in enabled drivers build config 00:01:34.694 net/cnxk: not in enabled drivers build config 00:01:34.694 net/cpfl: not in enabled drivers build config 00:01:34.694 net/cxgbe: not in enabled drivers build config 00:01:34.694 net/dpaa: not in enabled drivers build config 00:01:34.694 net/dpaa2: not in enabled drivers build config 00:01:34.694 net/e1000: not in enabled drivers build config 00:01:34.694 net/ena: not in enabled drivers build config 00:01:34.694 net/enetc: not in enabled drivers build config 00:01:34.694 net/enetfec: not in enabled drivers build config 00:01:34.694 net/enic: not in enabled drivers build config 00:01:34.694 net/failsafe: not in enabled drivers build config 00:01:34.694 net/fm10k: not in enabled drivers build config 00:01:34.694 net/gve: not in enabled drivers build config 00:01:34.694 net/hinic: not in enabled drivers build config 00:01:34.694 net/hns3: not in enabled drivers build config 00:01:34.694 net/i40e: not in enabled drivers build config 00:01:34.694 net/iavf: not in enabled drivers build config 00:01:34.694 net/ice: not in enabled drivers build config 00:01:34.694 net/idpf: not in enabled drivers build config 00:01:34.694 net/igc: not in enabled drivers build config 00:01:34.694 net/ionic: not in enabled drivers build config 00:01:34.694 net/ipn3ke: not in enabled drivers build config 00:01:34.694 net/ixgbe: not in enabled drivers build config 00:01:34.694 net/mana: not in enabled drivers build config 00:01:34.694 net/memif: not in enabled drivers build config 00:01:34.694 net/mlx4: not in enabled drivers build config 00:01:34.694 net/mlx5: not in enabled drivers build config 00:01:34.694 net/mvneta: not in enabled drivers build config 00:01:34.694 net/mvpp2: not in enabled drivers build config 00:01:34.694 net/netvsc: not in enabled drivers build config 00:01:34.694 net/nfb: not in enabled drivers build config 00:01:34.694 net/nfp: not in enabled drivers build config 00:01:34.694 net/ngbe: not in enabled drivers build config 00:01:34.694 net/null: not in enabled drivers build config 00:01:34.694 net/octeontx: not in enabled drivers build config 00:01:34.694 net/octeon_ep: not in enabled drivers build config 00:01:34.694 net/pcap: not in enabled drivers build config 00:01:34.694 net/pfe: not in enabled drivers build config 00:01:34.694 net/qede: not in enabled drivers build config 00:01:34.694 net/ring: not in enabled drivers build config 00:01:34.694 net/sfc: not in enabled 
drivers build config 00:01:34.694 net/softnic: not in enabled drivers build config 00:01:34.694 net/tap: not in enabled drivers build config 00:01:34.694 net/thunderx: not in enabled drivers build config 00:01:34.694 net/txgbe: not in enabled drivers build config 00:01:34.694 net/vdev_netvsc: not in enabled drivers build config 00:01:34.694 net/vhost: not in enabled drivers build config 00:01:34.694 net/virtio: not in enabled drivers build config 00:01:34.694 net/vmxnet3: not in enabled drivers build config 00:01:34.694 raw/*: missing internal dependency, "rawdev" 00:01:34.694 crypto/armv8: not in enabled drivers build config 00:01:34.694 crypto/bcmfs: not in enabled drivers build config 00:01:34.694 crypto/caam_jr: not in enabled drivers build config 00:01:34.694 crypto/ccp: not in enabled drivers build config 00:01:34.694 crypto/cnxk: not in enabled drivers build config 00:01:34.694 crypto/dpaa_sec: not in enabled drivers build config 00:01:34.694 crypto/dpaa2_sec: not in enabled drivers build config 00:01:34.694 crypto/ipsec_mb: not in enabled drivers build config 00:01:34.694 crypto/mlx5: not in enabled drivers build config 00:01:34.694 crypto/mvsam: not in enabled drivers build config 00:01:34.694 crypto/nitrox: not in enabled drivers build config 00:01:34.694 crypto/null: not in enabled drivers build config 00:01:34.694 crypto/octeontx: not in enabled drivers build config 00:01:34.694 crypto/openssl: not in enabled drivers build config 00:01:34.694 crypto/scheduler: not in enabled drivers build config 00:01:34.694 crypto/uadk: not in enabled drivers build config 00:01:34.694 crypto/virtio: not in enabled drivers build config 00:01:34.694 compress/isal: not in enabled drivers build config 00:01:34.694 compress/mlx5: not in enabled drivers build config 00:01:34.694 compress/octeontx: not in enabled drivers build config 00:01:34.694 compress/zlib: not in enabled drivers build config 00:01:34.694 regex/*: missing internal dependency, "regexdev" 00:01:34.694 ml/*: missing internal dependency, "mldev" 00:01:34.694 vdpa/ifc: not in enabled drivers build config 00:01:34.694 vdpa/mlx5: not in enabled drivers build config 00:01:34.694 vdpa/nfp: not in enabled drivers build config 00:01:34.694 vdpa/sfc: not in enabled drivers build config 00:01:34.694 event/*: missing internal dependency, "eventdev" 00:01:34.694 baseband/*: missing internal dependency, "bbdev" 00:01:34.694 gpu/*: missing internal dependency, "gpudev" 00:01:34.694 00:01:34.694 00:01:34.694 Build targets in project: 85 00:01:34.694 00:01:34.694 DPDK 23.11.0 00:01:34.694 00:01:34.694 User defined options 00:01:34.694 buildtype : debug 00:01:34.694 default_library : static 00:01:34.694 libdir : lib 00:01:34.694 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:34.694 b_lundef : false 00:01:34.694 b_sanitize : address 00:01:34.694 c_args : -fPIC -Werror 00:01:34.694 c_link_args : 00:01:34.694 cpu_instruction_set: native 00:01:34.694 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:01:34.694 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:01:34.694 enable_docs : false 00:01:34.694 enable_drivers : 
bus,bus/pci,bus/vdev,mempool/ring 00:01:34.694 enable_kmods : false 00:01:34.694 tests : false 00:01:34.694 00:01:34.694 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:34.694 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:01:34.694 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:34.694 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:34.694 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:34.694 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:34.694 [5/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:34.694 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:34.694 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:34.694 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:34.694 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:34.694 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:34.694 [11/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:34.694 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:34.694 [13/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:34.694 [14/265] Linking static target lib/librte_kvargs.a 00:01:34.694 [15/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:34.694 [16/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:34.694 [17/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:34.694 [18/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:34.694 [19/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:34.694 [20/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:34.694 [21/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:34.694 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:34.694 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:34.694 [24/265] Linking static target lib/librte_log.a 00:01:34.694 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:34.954 [26/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.954 [27/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:34.954 [28/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:34.954 [29/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:34.954 [30/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:35.218 [31/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:35.218 [32/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:35.218 [33/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:35.218 [34/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:35.218 [35/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:35.218 [36/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:35.218 [37/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:35.218 
[38/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:35.218 [39/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:35.218 [40/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:35.218 [41/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:35.218 [42/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:35.218 [43/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:35.218 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:35.218 [45/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:35.218 [46/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:35.218 [47/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:35.218 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:35.218 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:35.218 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:35.218 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:35.218 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:35.218 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:35.218 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:35.218 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:35.218 [56/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:35.218 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:35.218 [58/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:35.218 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:35.218 [60/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:35.218 [61/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:35.218 [62/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:35.218 [63/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:35.218 [64/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:35.218 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:35.218 [66/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:35.218 [67/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:35.218 [68/265] Linking static target lib/librte_telemetry.a 00:01:35.218 [69/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:35.218 [70/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:35.218 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:35.218 [72/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:35.218 [73/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:35.218 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:35.218 [75/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:35.218 [76/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:35.218 [77/265] Compiling C object 
lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:35.218 [78/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:35.218 [79/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:35.218 [80/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:35.218 [81/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:35.218 [82/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:35.218 [83/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:35.219 [84/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:35.219 [85/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:35.219 [86/265] Linking static target lib/librte_pci.a 00:01:35.219 [87/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:35.219 [88/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.219 [89/265] Linking static target lib/librte_meter.a 00:01:35.219 [90/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:35.219 [91/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:35.219 [92/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:35.219 [93/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:35.219 [94/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:35.219 [95/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:35.219 [96/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:35.219 [97/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:35.219 [98/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:35.219 [99/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:35.219 [100/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:35.219 [101/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:35.219 [102/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:35.219 [103/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:35.219 [104/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:35.219 [105/265] Linking static target lib/librte_ring.a 00:01:35.219 [106/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:35.219 [107/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:35.219 [108/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:35.219 [109/265] Linking target lib/librte_log.so.24.0 00:01:35.219 [110/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:35.477 [111/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:35.477 [112/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:35.477 [113/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:35.477 [114/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:35.477 [115/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:35.477 [116/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:35.477 [117/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:35.477 [118/265] Linking static target lib/librte_mempool.a 00:01:35.477 [119/265] 
Linking static target lib/librte_eal.a 00:01:35.477 [120/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:35.477 [121/265] Linking static target lib/librte_net.a 00:01:35.477 [122/265] Linking static target lib/librte_rcu.a 00:01:35.477 [123/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.477 [124/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.477 [125/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:35.477 [126/265] Linking target lib/librte_kvargs.so.24.0 00:01:35.477 [127/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.736 [128/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.736 [129/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:35.736 [130/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:35.736 [131/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:35.736 [132/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.736 [133/265] Linking target lib/librte_telemetry.so.24.0 00:01:35.736 [134/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:35.736 [135/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:35.736 [136/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:35.736 [137/265] Linking static target lib/librte_mbuf.a 00:01:35.736 [138/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:35.736 [139/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:35.736 [140/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.736 [141/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:35.736 [142/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:35.736 [143/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:35.736 [144/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:35.736 [145/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:35.736 [146/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:35.736 [147/265] Linking static target lib/librte_cmdline.a 00:01:35.736 [148/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:35.736 [149/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:35.736 [150/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:35.736 [151/265] Linking static target lib/librte_timer.a 00:01:35.736 [152/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:35.736 [153/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:35.736 [154/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:35.736 [155/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:35.736 [156/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:35.736 [157/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:35.736 [158/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:35.736 [159/265] Compiling 
C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:35.736 [160/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:35.736 [161/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:35.736 [162/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:35.736 [163/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:35.996 [164/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:35.996 [165/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:35.996 [166/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:35.996 [167/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:35.996 [168/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:35.996 [169/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:35.996 [170/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:35.996 [171/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:35.996 [172/265] Linking static target lib/librte_compressdev.a 00:01:35.996 [173/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:35.996 [174/265] Linking static target lib/librte_dmadev.a 00:01:35.996 [175/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:35.996 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:35.996 [177/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:35.996 [178/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:35.996 [179/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:35.996 [180/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:35.996 [181/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:35.996 [182/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:35.996 [183/265] Linking static target lib/librte_power.a 00:01:35.996 [184/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:35.996 [185/265] Linking static target lib/librte_reorder.a 00:01:35.996 [186/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:35.996 [187/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:35.996 [188/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:35.996 [189/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:35.996 [190/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:35.996 [191/265] Linking static target lib/librte_security.a 00:01:35.996 [192/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:35.996 [193/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:35.996 [194/265] Linking static target lib/librte_hash.a 00:01:35.996 [195/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:35.996 [196/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:35.996 [197/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:36.254 [198/265] Linking static target drivers/librte_bus_vdev.a 00:01:36.254 [199/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.254 
[200/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:36.254 [201/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:36.254 [202/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.254 [203/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.254 [204/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.254 [205/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.254 [206/265] Linking static target drivers/librte_bus_pci.a 00:01:36.254 [207/265] Linking static target drivers/librte_mempool_ring.a 00:01:36.254 [208/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.254 [209/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:36.254 [210/265] Linking static target lib/librte_cryptodev.a 00:01:36.254 [211/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:36.511 [212/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.511 [213/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.511 [214/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.511 [215/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.511 [216/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:36.511 [217/265] Linking static target lib/librte_ethdev.a 00:01:36.511 [218/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.770 [219/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.770 [220/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:37.029 [221/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.029 [222/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.029 [223/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.029 [224/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.403 [225/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:38.403 [226/265] Linking static target lib/librte_vhost.a 00:01:38.403 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.320 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.591 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.127 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.127 [231/265] Linking target lib/librte_eal.so.24.0 00:01:48.386 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:48.386 [233/265] Linking target lib/librte_timer.so.24.0 00:01:48.386 [234/265] Linking target lib/librte_ring.so.24.0 00:01:48.386 [235/265] Linking target lib/librte_meter.so.24.0 00:01:48.386 [236/265] Linking target lib/librte_pci.so.24.0 00:01:48.386 [237/265] Linking target 
lib/librte_dmadev.so.24.0 00:01:48.386 [238/265] Linking target drivers/librte_bus_vdev.so.24.0 00:01:48.386 [239/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:48.386 [240/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:48.386 [241/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:48.386 [242/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:48.386 [243/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:48.645 [244/265] Linking target lib/librte_rcu.so.24.0 00:01:48.645 [245/265] Linking target lib/librte_mempool.so.24.0 00:01:48.645 [246/265] Linking target drivers/librte_bus_pci.so.24.0 00:01:48.645 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:48.645 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:48.645 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:01:48.903 [250/265] Linking target lib/librte_mbuf.so.24.0 00:01:48.903 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:48.903 [252/265] Linking target lib/librte_compressdev.so.24.0 00:01:48.903 [253/265] Linking target lib/librte_net.so.24.0 00:01:48.903 [254/265] Linking target lib/librte_cryptodev.so.24.0 00:01:48.903 [255/265] Linking target lib/librte_reorder.so.24.0 00:01:49.161 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:49.161 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:49.161 [258/265] Linking target lib/librte_cmdline.so.24.0 00:01:49.161 [259/265] Linking target lib/librte_hash.so.24.0 00:01:49.161 [260/265] Linking target lib/librte_ethdev.so.24.0 00:01:49.161 [261/265] Linking target lib/librte_security.so.24.0 00:01:49.419 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:49.419 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:49.419 [264/265] Linking target lib/librte_power.so.24.0 00:01:49.419 [265/265] Linking target lib/librte_vhost.so.24.0 00:01:49.419 INFO: autodetecting backend as ninja 00:01:49.419 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 72 00:01:50.354 CC lib/ut_mock/mock.o 00:01:50.354 CC lib/log/log_flags.o 00:01:50.354 CC lib/log/log_deprecated.o 00:01:50.354 CC lib/log/log.o 00:01:50.354 CC lib/ut/ut.o 00:01:50.612 LIB libspdk_ut_mock.a 00:01:50.612 LIB libspdk_ut.a 00:01:50.612 LIB libspdk_log.a 00:01:50.870 CXX lib/trace_parser/trace.o 00:01:50.870 CC lib/dma/dma.o 00:01:50.870 CC lib/util/base64.o 00:01:50.870 CC lib/util/bit_array.o 00:01:50.870 CC lib/util/crc16.o 00:01:50.870 CC lib/util/cpuset.o 00:01:50.870 CC lib/util/crc32c.o 00:01:50.870 CC lib/ioat/ioat.o 00:01:50.870 CC lib/util/crc32.o 00:01:50.870 CC lib/util/crc32_ieee.o 00:01:50.870 CC lib/util/crc64.o 00:01:50.870 CC lib/util/dif.o 00:01:50.870 CC lib/util/fd.o 00:01:50.870 CC lib/util/file.o 00:01:50.870 CC lib/util/hexlify.o 00:01:50.870 CC lib/util/iov.o 00:01:50.870 CC lib/util/math.o 00:01:50.870 CC lib/util/pipe.o 00:01:50.870 CC lib/util/strerror_tls.o 00:01:50.870 CC lib/util/string.o 00:01:50.870 CC lib/util/fd_group.o 00:01:50.870 CC lib/util/uuid.o 00:01:50.870 CC lib/util/xor.o 00:01:50.870 CC 
lib/util/zipf.o 00:01:50.870 CC lib/vfio_user/host/vfio_user.o 00:01:50.870 CC lib/vfio_user/host/vfio_user_pci.o 00:01:51.128 LIB libspdk_dma.a 00:01:51.128 LIB libspdk_ioat.a 00:01:51.128 LIB libspdk_vfio_user.a 00:01:51.385 LIB libspdk_util.a 00:01:51.385 LIB libspdk_trace_parser.a 00:01:51.643 CC lib/conf/conf.o 00:01:51.643 CC lib/vmd/vmd.o 00:01:51.643 CC lib/vmd/led.o 00:01:51.643 CC lib/env_dpdk/env.o 00:01:51.643 CC lib/env_dpdk/memory.o 00:01:51.643 CC lib/env_dpdk/pci.o 00:01:51.643 CC lib/env_dpdk/init.o 00:01:51.643 CC lib/env_dpdk/threads.o 00:01:51.643 CC lib/env_dpdk/pci_virtio.o 00:01:51.643 CC lib/json/json_parse.o 00:01:51.643 CC lib/env_dpdk/pci_ioat.o 00:01:51.643 CC lib/json/json_write.o 00:01:51.643 CC lib/json/json_util.o 00:01:51.643 CC lib/env_dpdk/pci_vmd.o 00:01:51.643 CC lib/rdma/common.o 00:01:51.643 CC lib/env_dpdk/pci_idxd.o 00:01:51.643 CC lib/rdma/rdma_verbs.o 00:01:51.643 CC lib/env_dpdk/pci_event.o 00:01:51.643 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:51.643 CC lib/env_dpdk/sigbus_handler.o 00:01:51.643 CC lib/env_dpdk/pci_dpdk.o 00:01:51.643 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:51.643 CC lib/idxd/idxd.o 00:01:51.644 CC lib/idxd/idxd_user.o 00:01:51.644 LIB libspdk_conf.a 00:01:51.901 LIB libspdk_json.a 00:01:51.901 LIB libspdk_rdma.a 00:01:51.901 LIB libspdk_idxd.a 00:01:52.160 LIB libspdk_vmd.a 00:01:52.160 CC lib/jsonrpc/jsonrpc_server.o 00:01:52.160 CC lib/jsonrpc/jsonrpc_client.o 00:01:52.160 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:52.160 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:52.419 LIB libspdk_jsonrpc.a 00:01:52.677 CC lib/rpc/rpc.o 00:01:52.677 LIB libspdk_env_dpdk.a 00:01:52.677 LIB libspdk_rpc.a 00:01:53.242 CC lib/trace/trace_flags.o 00:01:53.242 CC lib/trace/trace.o 00:01:53.242 CC lib/keyring/keyring.o 00:01:53.242 CC lib/keyring/keyring_rpc.o 00:01:53.242 CC lib/trace/trace_rpc.o 00:01:53.242 CC lib/notify/notify.o 00:01:53.242 CC lib/notify/notify_rpc.o 00:01:53.242 LIB libspdk_notify.a 00:01:53.242 LIB libspdk_trace.a 00:01:53.242 LIB libspdk_keyring.a 00:01:53.500 CC lib/thread/thread.o 00:01:53.500 CC lib/thread/iobuf.o 00:01:53.500 CC lib/sock/sock.o 00:01:53.500 CC lib/sock/sock_rpc.o 00:01:53.757 LIB libspdk_sock.a 00:01:54.323 CC lib/nvme/nvme_ctrlr.o 00:01:54.323 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:54.323 CC lib/nvme/nvme_fabric.o 00:01:54.323 CC lib/nvme/nvme_ns_cmd.o 00:01:54.323 CC lib/nvme/nvme_ns.o 00:01:54.323 CC lib/nvme/nvme_pcie_common.o 00:01:54.323 CC lib/nvme/nvme_pcie.o 00:01:54.323 CC lib/nvme/nvme_qpair.o 00:01:54.323 CC lib/nvme/nvme.o 00:01:54.323 CC lib/nvme/nvme_quirks.o 00:01:54.323 CC lib/nvme/nvme_transport.o 00:01:54.323 CC lib/nvme/nvme_discovery.o 00:01:54.323 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:54.323 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:54.323 CC lib/nvme/nvme_tcp.o 00:01:54.323 CC lib/nvme/nvme_opal.o 00:01:54.323 CC lib/nvme/nvme_io_msg.o 00:01:54.323 CC lib/nvme/nvme_poll_group.o 00:01:54.323 CC lib/nvme/nvme_zns.o 00:01:54.323 CC lib/nvme/nvme_stubs.o 00:01:54.323 CC lib/nvme/nvme_auth.o 00:01:54.323 CC lib/nvme/nvme_cuse.o 00:01:54.323 CC lib/nvme/nvme_vfio_user.o 00:01:54.323 CC lib/nvme/nvme_rdma.o 00:01:54.580 LIB libspdk_thread.a 00:01:54.838 CC lib/blob/blobstore.o 00:01:54.838 CC lib/blob/zeroes.o 00:01:54.838 CC lib/blob/request.o 00:01:54.838 CC lib/blob/blob_bs_dev.o 00:01:54.838 CC lib/virtio/virtio_vhost_user.o 00:01:54.838 CC lib/virtio/virtio.o 00:01:54.838 CC lib/virtio/virtio_vfio_user.o 00:01:54.838 CC lib/virtio/virtio_pci.o 00:01:54.838 CC lib/init/json_config.o 00:01:54.838 
CC lib/init/subsystem.o 00:01:54.838 CC lib/init/subsystem_rpc.o 00:01:54.838 CC lib/init/rpc.o 00:01:54.838 CC lib/vfu_tgt/tgt_endpoint.o 00:01:54.838 CC lib/vfu_tgt/tgt_rpc.o 00:01:54.838 CC lib/accel/accel.o 00:01:54.838 CC lib/accel/accel_rpc.o 00:01:54.838 CC lib/accel/accel_sw.o 00:01:54.838 LIB libspdk_init.a 00:01:55.119 LIB libspdk_virtio.a 00:01:55.119 LIB libspdk_vfu_tgt.a 00:01:55.119 CC lib/event/log_rpc.o 00:01:55.119 CC lib/event/app.o 00:01:55.119 CC lib/event/reactor.o 00:01:55.416 CC lib/event/app_rpc.o 00:01:55.416 CC lib/event/scheduler_static.o 00:01:55.416 LIB libspdk_event.a 00:01:55.695 LIB libspdk_accel.a 00:01:55.695 LIB libspdk_nvme.a 00:01:55.954 CC lib/bdev/bdev_rpc.o 00:01:55.954 CC lib/bdev/bdev.o 00:01:55.954 CC lib/bdev/part.o 00:01:55.954 CC lib/bdev/scsi_nvme.o 00:01:55.954 CC lib/bdev/bdev_zone.o 00:01:56.891 LIB libspdk_blob.a 00:01:57.149 CC lib/lvol/lvol.o 00:01:57.149 CC lib/blobfs/blobfs.o 00:01:57.149 CC lib/blobfs/tree.o 00:01:57.717 LIB libspdk_lvol.a 00:01:57.717 LIB libspdk_blobfs.a 00:01:57.976 LIB libspdk_bdev.a 00:01:58.239 CC lib/ublk/ublk.o 00:01:58.239 CC lib/ublk/ublk_rpc.o 00:01:58.239 CC lib/scsi/lun.o 00:01:58.239 CC lib/scsi/dev.o 00:01:58.239 CC lib/scsi/scsi_bdev.o 00:01:58.239 CC lib/scsi/port.o 00:01:58.239 CC lib/scsi/scsi.o 00:01:58.239 CC lib/scsi/scsi_pr.o 00:01:58.239 CC lib/scsi/scsi_rpc.o 00:01:58.239 CC lib/scsi/task.o 00:01:58.239 CC lib/nbd/nbd_rpc.o 00:01:58.239 CC lib/ftl/ftl_core.o 00:01:58.239 CC lib/nbd/nbd.o 00:01:58.239 CC lib/ftl/ftl_init.o 00:01:58.239 CC lib/ftl/ftl_layout.o 00:01:58.239 CC lib/ftl/ftl_debug.o 00:01:58.239 CC lib/ftl/ftl_io.o 00:01:58.239 CC lib/ftl/ftl_sb.o 00:01:58.239 CC lib/ftl/ftl_l2p.o 00:01:58.239 CC lib/ftl/ftl_l2p_flat.o 00:01:58.239 CC lib/ftl/ftl_nv_cache.o 00:01:58.239 CC lib/nvmf/ctrlr.o 00:01:58.239 CC lib/ftl/ftl_band.o 00:01:58.239 CC lib/nvmf/ctrlr_discovery.o 00:01:58.239 CC lib/ftl/ftl_rq.o 00:01:58.239 CC lib/ftl/ftl_band_ops.o 00:01:58.239 CC lib/nvmf/ctrlr_bdev.o 00:01:58.239 CC lib/ftl/ftl_writer.o 00:01:58.239 CC lib/nvmf/subsystem.o 00:01:58.239 CC lib/nvmf/nvmf.o 00:01:58.239 CC lib/ftl/ftl_reloc.o 00:01:58.239 CC lib/nvmf/nvmf_rpc.o 00:01:58.239 CC lib/ftl/ftl_l2p_cache.o 00:01:58.239 CC lib/ftl/ftl_p2l.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt.o 00:01:58.239 CC lib/nvmf/transport.o 00:01:58.239 CC lib/nvmf/tcp.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:01:58.239 CC lib/nvmf/vfio_user.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:01:58.239 CC lib/nvmf/rdma.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_misc.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_startup.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_md.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_band.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:01:58.239 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:01:58.239 CC lib/ftl/utils/ftl_conf.o 00:01:58.239 CC lib/ftl/utils/ftl_md.o 00:01:58.239 CC lib/ftl/utils/ftl_mempool.o 00:01:58.239 CC lib/ftl/utils/ftl_property.o 00:01:58.239 CC lib/ftl/utils/ftl_bitmap.o 00:01:58.239 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:01:58.239 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:01:58.239 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:01:58.239 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:01:58.239 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:01:58.239 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:01:58.239 CC 
lib/ftl/upgrade/ftl_sb_v3.o 00:01:58.239 CC lib/ftl/upgrade/ftl_sb_v5.o 00:01:58.239 CC lib/ftl/nvc/ftl_nvc_dev.o 00:01:58.239 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:01:58.239 CC lib/ftl/base/ftl_base_dev.o 00:01:58.239 CC lib/ftl/base/ftl_base_bdev.o 00:01:58.239 CC lib/ftl/ftl_trace.o 00:01:58.808 LIB libspdk_nbd.a 00:01:58.808 LIB libspdk_scsi.a 00:01:58.808 LIB libspdk_ublk.a 00:01:59.066 LIB libspdk_ftl.a 00:01:59.066 CC lib/iscsi/conn.o 00:01:59.066 CC lib/iscsi/init_grp.o 00:01:59.066 CC lib/iscsi/iscsi.o 00:01:59.066 CC lib/iscsi/md5.o 00:01:59.066 CC lib/iscsi/param.o 00:01:59.066 CC lib/iscsi/portal_grp.o 00:01:59.066 CC lib/iscsi/tgt_node.o 00:01:59.066 CC lib/iscsi/iscsi_subsystem.o 00:01:59.066 CC lib/iscsi/iscsi_rpc.o 00:01:59.066 CC lib/iscsi/task.o 00:01:59.066 CC lib/vhost/vhost.o 00:01:59.066 CC lib/vhost/vhost_rpc.o 00:01:59.066 CC lib/vhost/vhost_scsi.o 00:01:59.066 CC lib/vhost/vhost_blk.o 00:01:59.066 CC lib/vhost/rte_vhost_user.o 00:01:59.634 LIB libspdk_nvmf.a 00:01:59.634 LIB libspdk_vhost.a 00:01:59.893 LIB libspdk_iscsi.a 00:02:00.460 CC module/env_dpdk/env_dpdk_rpc.o 00:02:00.460 CC module/vfu_device/vfu_virtio.o 00:02:00.460 CC module/vfu_device/vfu_virtio_scsi.o 00:02:00.460 CC module/vfu_device/vfu_virtio_rpc.o 00:02:00.460 CC module/vfu_device/vfu_virtio_blk.o 00:02:00.460 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:00.460 CC module/accel/dsa/accel_dsa.o 00:02:00.460 CC module/accel/dsa/accel_dsa_rpc.o 00:02:00.460 LIB libspdk_env_dpdk_rpc.a 00:02:00.460 CC module/accel/ioat/accel_ioat.o 00:02:00.460 CC module/accel/ioat/accel_ioat_rpc.o 00:02:00.460 CC module/scheduler/gscheduler/gscheduler.o 00:02:00.460 CC module/blob/bdev/blob_bdev.o 00:02:00.460 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:00.460 CC module/accel/error/accel_error_rpc.o 00:02:00.460 CC module/accel/error/accel_error.o 00:02:00.460 CC module/accel/iaa/accel_iaa.o 00:02:00.460 CC module/accel/iaa/accel_iaa_rpc.o 00:02:00.460 CC module/keyring/file/keyring.o 00:02:00.460 CC module/keyring/file/keyring_rpc.o 00:02:00.460 CC module/sock/posix/posix.o 00:02:00.718 LIB libspdk_scheduler_dpdk_governor.a 00:02:00.718 LIB libspdk_scheduler_gscheduler.a 00:02:00.718 LIB libspdk_keyring_file.a 00:02:00.718 LIB libspdk_accel_error.a 00:02:00.718 LIB libspdk_scheduler_dynamic.a 00:02:00.718 LIB libspdk_accel_ioat.a 00:02:00.718 LIB libspdk_accel_iaa.a 00:02:00.718 LIB libspdk_blob_bdev.a 00:02:00.718 LIB libspdk_accel_dsa.a 00:02:00.978 LIB libspdk_vfu_device.a 00:02:00.978 CC module/bdev/malloc/bdev_malloc.o 00:02:00.978 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:00.978 CC module/bdev/gpt/gpt.o 00:02:00.978 CC module/bdev/gpt/vbdev_gpt.o 00:02:00.978 LIB libspdk_sock_posix.a 00:02:00.978 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:00.978 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:00.978 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:00.978 CC module/bdev/raid/bdev_raid.o 00:02:00.978 CC module/bdev/error/vbdev_error.o 00:02:00.978 CC module/bdev/lvol/vbdev_lvol.o 00:02:00.978 CC module/bdev/raid/bdev_raid_sb.o 00:02:00.978 CC module/bdev/raid/bdev_raid_rpc.o 00:02:00.978 CC module/bdev/error/vbdev_error_rpc.o 00:02:00.978 CC module/bdev/raid/concat.o 00:02:00.978 CC module/bdev/raid/raid0.o 00:02:00.978 CC module/bdev/raid/raid1.o 00:02:00.978 CC module/blobfs/bdev/blobfs_bdev.o 00:02:00.978 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:01.237 CC module/bdev/passthru/vbdev_passthru.o 00:02:01.237 CC module/bdev/aio/bdev_aio_rpc.o 00:02:01.237 CC 
module/bdev/aio/bdev_aio.o 00:02:01.237 CC module/bdev/null/bdev_null.o 00:02:01.237 CC module/bdev/null/bdev_null_rpc.o 00:02:01.237 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:01.237 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:01.237 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:01.237 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:01.237 CC module/bdev/iscsi/bdev_iscsi.o 00:02:01.237 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:01.237 CC module/bdev/split/vbdev_split.o 00:02:01.237 CC module/bdev/delay/vbdev_delay.o 00:02:01.237 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:01.237 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:01.237 CC module/bdev/nvme/bdev_nvme.o 00:02:01.237 CC module/bdev/split/vbdev_split_rpc.o 00:02:01.237 CC module/bdev/nvme/nvme_rpc.o 00:02:01.238 CC module/bdev/nvme/bdev_mdns_client.o 00:02:01.238 CC module/bdev/nvme/vbdev_opal.o 00:02:01.238 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:01.238 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:01.238 CC module/bdev/ftl/bdev_ftl.o 00:02:01.238 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:01.238 LIB libspdk_bdev_gpt.a 00:02:01.238 LIB libspdk_bdev_error.a 00:02:01.238 LIB libspdk_bdev_split.a 00:02:01.238 LIB libspdk_blobfs_bdev.a 00:02:01.496 LIB libspdk_bdev_passthru.a 00:02:01.496 LIB libspdk_bdev_zone_block.a 00:02:01.496 LIB libspdk_bdev_aio.a 00:02:01.496 LIB libspdk_bdev_null.a 00:02:01.496 LIB libspdk_bdev_malloc.a 00:02:01.496 LIB libspdk_bdev_iscsi.a 00:02:01.496 LIB libspdk_bdev_delay.a 00:02:01.496 LIB libspdk_bdev_ftl.a 00:02:01.496 LIB libspdk_bdev_virtio.a 00:02:01.496 LIB libspdk_bdev_lvol.a 00:02:01.755 LIB libspdk_bdev_raid.a 00:02:02.694 LIB libspdk_bdev_nvme.a 00:02:03.262 CC module/event/subsystems/iobuf/iobuf.o 00:02:03.262 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:03.262 CC module/event/subsystems/sock/sock.o 00:02:03.262 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:03.262 CC module/event/subsystems/scheduler/scheduler.o 00:02:03.262 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:03.262 CC module/event/subsystems/keyring/keyring.o 00:02:03.262 CC module/event/subsystems/vmd/vmd.o 00:02:03.262 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:03.262 LIB libspdk_event_iobuf.a 00:02:03.262 LIB libspdk_event_sock.a 00:02:03.262 LIB libspdk_event_vfu_tgt.a 00:02:03.262 LIB libspdk_event_keyring.a 00:02:03.262 LIB libspdk_event_scheduler.a 00:02:03.262 LIB libspdk_event_vhost_blk.a 00:02:03.262 LIB libspdk_event_vmd.a 00:02:03.521 CC module/event/subsystems/accel/accel.o 00:02:03.779 LIB libspdk_event_accel.a 00:02:04.037 CC module/event/subsystems/bdev/bdev.o 00:02:04.037 LIB libspdk_event_bdev.a 00:02:04.604 CC module/event/subsystems/nbd/nbd.o 00:02:04.604 CC module/event/subsystems/ublk/ublk.o 00:02:04.604 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:04.604 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:04.604 CC module/event/subsystems/scsi/scsi.o 00:02:04.604 LIB libspdk_event_ublk.a 00:02:04.604 LIB libspdk_event_nbd.a 00:02:04.604 LIB libspdk_event_scsi.a 00:02:04.604 LIB libspdk_event_nvmf.a 00:02:04.863 CC module/event/subsystems/iscsi/iscsi.o 00:02:04.863 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:05.122 LIB libspdk_event_vhost_scsi.a 00:02:05.122 LIB libspdk_event_iscsi.a 00:02:05.390 CC app/spdk_top/spdk_top.o 00:02:05.390 CC app/spdk_nvme_identify/identify.o 00:02:05.390 CC app/spdk_lspci/spdk_lspci.o 00:02:05.390 CC app/trace_record/trace_record.o 00:02:05.390 CXX app/trace/trace.o 00:02:05.390 CC app/spdk_nvme_perf/perf.o 00:02:05.390 CC 
app/spdk_nvme_discover/discovery_aer.o 00:02:05.390 CC test/rpc_client/rpc_client_test.o 00:02:05.390 TEST_HEADER include/spdk/accel.h 00:02:05.390 TEST_HEADER include/spdk/accel_module.h 00:02:05.390 TEST_HEADER include/spdk/assert.h 00:02:05.390 TEST_HEADER include/spdk/barrier.h 00:02:05.390 TEST_HEADER include/spdk/base64.h 00:02:05.390 TEST_HEADER include/spdk/bdev.h 00:02:05.390 TEST_HEADER include/spdk/bdev_module.h 00:02:05.390 TEST_HEADER include/spdk/bdev_zone.h 00:02:05.390 TEST_HEADER include/spdk/bit_array.h 00:02:05.390 TEST_HEADER include/spdk/bit_pool.h 00:02:05.390 TEST_HEADER include/spdk/blob_bdev.h 00:02:05.390 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:05.390 TEST_HEADER include/spdk/blobfs.h 00:02:05.390 TEST_HEADER include/spdk/blob.h 00:02:05.390 TEST_HEADER include/spdk/conf.h 00:02:05.390 TEST_HEADER include/spdk/config.h 00:02:05.390 TEST_HEADER include/spdk/cpuset.h 00:02:05.390 TEST_HEADER include/spdk/crc16.h 00:02:05.390 TEST_HEADER include/spdk/crc32.h 00:02:05.391 TEST_HEADER include/spdk/crc64.h 00:02:05.391 CC app/iscsi_tgt/iscsi_tgt.o 00:02:05.391 TEST_HEADER include/spdk/dif.h 00:02:05.391 TEST_HEADER include/spdk/dma.h 00:02:05.391 TEST_HEADER include/spdk/endian.h 00:02:05.391 CC app/spdk_dd/spdk_dd.o 00:02:05.391 TEST_HEADER include/spdk/env_dpdk.h 00:02:05.391 CC app/vhost/vhost.o 00:02:05.391 TEST_HEADER include/spdk/env.h 00:02:05.391 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:05.391 TEST_HEADER include/spdk/event.h 00:02:05.391 CC app/nvmf_tgt/nvmf_main.o 00:02:05.391 TEST_HEADER include/spdk/fd_group.h 00:02:05.391 TEST_HEADER include/spdk/fd.h 00:02:05.391 TEST_HEADER include/spdk/file.h 00:02:05.391 CC examples/sock/hello_world/hello_sock.o 00:02:05.391 TEST_HEADER include/spdk/ftl.h 00:02:05.391 TEST_HEADER include/spdk/gpt_spec.h 00:02:05.391 CC app/fio/nvme/fio_plugin.o 00:02:05.391 CC app/spdk_tgt/spdk_tgt.o 00:02:05.391 TEST_HEADER include/spdk/hexlify.h 00:02:05.391 TEST_HEADER include/spdk/histogram_data.h 00:02:05.391 TEST_HEADER include/spdk/idxd.h 00:02:05.391 TEST_HEADER include/spdk/idxd_spec.h 00:02:05.391 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:05.391 TEST_HEADER include/spdk/init.h 00:02:05.391 CC examples/nvme/arbitration/arbitration.o 00:02:05.391 CC examples/vmd/led/led.o 00:02:05.391 TEST_HEADER include/spdk/ioat.h 00:02:05.391 CC examples/accel/perf/accel_perf.o 00:02:05.391 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:05.391 TEST_HEADER include/spdk/ioat_spec.h 00:02:05.391 CC examples/vmd/lsvmd/lsvmd.o 00:02:05.391 CC examples/nvme/abort/abort.o 00:02:05.391 CC test/app/jsoncat/jsoncat.o 00:02:05.391 TEST_HEADER include/spdk/iscsi_spec.h 00:02:05.391 CC examples/nvme/hotplug/hotplug.o 00:02:05.391 CC test/app/stub/stub.o 00:02:05.391 CC examples/util/zipf/zipf.o 00:02:05.391 CC examples/nvme/hello_world/hello_world.o 00:02:05.391 TEST_HEADER include/spdk/json.h 00:02:05.391 CC examples/ioat/perf/perf.o 00:02:05.391 CC test/app/histogram_perf/histogram_perf.o 00:02:05.391 TEST_HEADER include/spdk/jsonrpc.h 00:02:05.391 CC examples/nvme/reconnect/reconnect.o 00:02:05.391 CC examples/idxd/perf/perf.o 00:02:05.391 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:05.391 TEST_HEADER include/spdk/keyring.h 00:02:05.391 TEST_HEADER include/spdk/keyring_module.h 00:02:05.391 CC examples/blob/cli/blobcli.o 00:02:05.391 CC examples/ioat/verify/verify.o 00:02:05.391 TEST_HEADER include/spdk/likely.h 00:02:05.391 CC test/nvme/startup/startup.o 00:02:05.391 CC test/env/pci/pci_ut.o 00:02:05.391 CC 
test/event/reactor_perf/reactor_perf.o 00:02:05.391 TEST_HEADER include/spdk/log.h 00:02:05.391 CC examples/bdev/hello_world/hello_bdev.o 00:02:05.391 CC app/fio/bdev/fio_plugin.o 00:02:05.391 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:05.391 CC test/thread/lock/spdk_lock.o 00:02:05.391 TEST_HEADER include/spdk/lvol.h 00:02:05.391 CC test/event/reactor/reactor.o 00:02:05.391 CC test/nvme/sgl/sgl.o 00:02:05.391 CC test/thread/poller_perf/poller_perf.o 00:02:05.391 CC test/nvme/reset/reset.o 00:02:05.391 CC test/nvme/aer/aer.o 00:02:05.391 TEST_HEADER include/spdk/memory.h 00:02:05.391 CC test/nvme/boot_partition/boot_partition.o 00:02:05.391 CC test/nvme/err_injection/err_injection.o 00:02:05.391 CC test/event/event_perf/event_perf.o 00:02:05.391 CC test/nvme/overhead/overhead.o 00:02:05.391 TEST_HEADER include/spdk/mmio.h 00:02:05.391 CC test/nvme/reserve/reserve.o 00:02:05.391 CC test/env/memory/memory_ut.o 00:02:05.391 CC test/env/vtophys/vtophys.o 00:02:05.391 TEST_HEADER include/spdk/nbd.h 00:02:05.391 CC test/nvme/connect_stress/connect_stress.o 00:02:05.391 CC test/nvme/simple_copy/simple_copy.o 00:02:05.391 CC examples/blob/hello_world/hello_blob.o 00:02:05.391 TEST_HEADER include/spdk/notify.h 00:02:05.391 CC test/nvme/e2edp/nvme_dp.o 00:02:05.391 CC examples/bdev/bdevperf/bdevperf.o 00:02:05.391 TEST_HEADER include/spdk/nvme.h 00:02:05.391 TEST_HEADER include/spdk/nvme_intel.h 00:02:05.391 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:05.656 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:05.656 LINK spdk_lspci 00:02:05.656 TEST_HEADER include/spdk/nvme_spec.h 00:02:05.656 CC test/event/app_repeat/app_repeat.o 00:02:05.656 TEST_HEADER include/spdk/nvme_zns.h 00:02:05.656 CC examples/nvmf/nvmf/nvmf.o 00:02:05.656 CC test/blobfs/mkfs/mkfs.o 00:02:05.656 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:05.656 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:05.656 CC examples/thread/thread/thread_ex.o 00:02:05.656 CC test/accel/dif/dif.o 00:02:05.656 TEST_HEADER include/spdk/nvmf.h 00:02:05.656 CC test/dma/test_dma/test_dma.o 00:02:05.656 TEST_HEADER include/spdk/nvmf_spec.h 00:02:05.656 TEST_HEADER include/spdk/nvmf_transport.h 00:02:05.656 TEST_HEADER include/spdk/opal.h 00:02:05.656 CC test/app/bdev_svc/bdev_svc.o 00:02:05.656 TEST_HEADER include/spdk/opal_spec.h 00:02:05.656 CC test/bdev/bdevio/bdevio.o 00:02:05.656 TEST_HEADER include/spdk/pci_ids.h 00:02:05.656 TEST_HEADER include/spdk/pipe.h 00:02:05.656 TEST_HEADER include/spdk/queue.h 00:02:05.656 TEST_HEADER include/spdk/reduce.h 00:02:05.656 CC test/event/scheduler/scheduler.o 00:02:05.656 TEST_HEADER include/spdk/rpc.h 00:02:05.656 TEST_HEADER include/spdk/scheduler.h 00:02:05.656 TEST_HEADER include/spdk/scsi.h 00:02:05.656 TEST_HEADER include/spdk/scsi_spec.h 00:02:05.656 TEST_HEADER include/spdk/sock.h 00:02:05.656 TEST_HEADER include/spdk/stdinc.h 00:02:05.656 TEST_HEADER include/spdk/string.h 00:02:05.656 TEST_HEADER include/spdk/thread.h 00:02:05.656 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:05.656 CC test/lvol/esnap/esnap.o 00:02:05.656 TEST_HEADER include/spdk/trace.h 00:02:05.656 LINK rpc_client_test 00:02:05.656 TEST_HEADER include/spdk/trace_parser.h 00:02:05.656 TEST_HEADER include/spdk/tree.h 00:02:05.656 TEST_HEADER include/spdk/ublk.h 00:02:05.656 LINK spdk_nvme_discover 00:02:05.656 CC test/env/mem_callbacks/mem_callbacks.o 00:02:05.656 TEST_HEADER include/spdk/util.h 00:02:05.656 TEST_HEADER include/spdk/uuid.h 00:02:05.656 TEST_HEADER include/spdk/version.h 00:02:05.656 TEST_HEADER 
include/spdk/vfio_user_pci.h 00:02:05.656 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:05.656 TEST_HEADER include/spdk/vhost.h 00:02:05.656 TEST_HEADER include/spdk/vmd.h 00:02:05.656 TEST_HEADER include/spdk/xor.h 00:02:05.656 TEST_HEADER include/spdk/zipf.h 00:02:05.656 CXX test/cpp_headers/accel.o 00:02:05.656 LINK spdk_trace_record 00:02:05.656 LINK lsvmd 00:02:05.656 LINK led 00:02:05.656 LINK jsoncat 00:02:05.656 LINK vhost 00:02:05.656 LINK interrupt_tgt 00:02:05.656 LINK nvmf_tgt 00:02:05.656 LINK zipf 00:02:05.656 LINK histogram_perf 00:02:05.656 LINK reactor 00:02:05.656 LINK reactor_perf 00:02:05.656 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:05.656 struct spdk_nvme_fdp_ruhs ruhs; 00:02:05.656 ^ 00:02:05.656 LINK poller_perf 00:02:05.656 LINK iscsi_tgt 00:02:05.656 LINK vtophys 00:02:05.656 LINK event_perf 00:02:05.656 LINK env_dpdk_post_init 00:02:05.656 LINK stub 00:02:05.656 LINK startup 00:02:05.656 LINK pmr_persistence 00:02:05.656 LINK app_repeat 00:02:05.656 LINK cmb_copy 00:02:05.656 LINK spdk_tgt 00:02:05.656 LINK boot_partition 00:02:05.656 LINK connect_stress 00:02:05.656 LINK err_injection 00:02:05.656 LINK verify 00:02:05.656 LINK ioat_perf 00:02:05.922 LINK hello_sock 00:02:05.922 LINK reserve 00:02:05.922 LINK hello_world 00:02:05.922 LINK hotplug 00:02:05.922 LINK bdev_svc 00:02:05.922 LINK mkfs 00:02:05.922 LINK simple_copy 00:02:05.922 LINK hello_bdev 00:02:05.922 LINK hello_blob 00:02:05.922 LINK thread 00:02:05.922 LINK reset 00:02:05.922 LINK sgl 00:02:05.922 LINK aer 00:02:05.922 CXX test/cpp_headers/accel_module.o 00:02:05.922 LINK nvme_dp 00:02:05.922 LINK spdk_trace 00:02:05.922 LINK scheduler 00:02:05.922 LINK overhead 00:02:05.922 LINK nvmf 00:02:05.922 LINK idxd_perf 00:02:05.922 LINK arbitration 00:02:05.922 LINK abort 00:02:05.922 LINK reconnect 00:02:06.181 LINK spdk_dd 00:02:06.181 LINK test_dma 00:02:06.181 LINK accel_perf 00:02:06.181 LINK pci_ut 00:02:06.181 LINK bdevio 00:02:06.181 LINK nvme_manage 00:02:06.181 CXX test/cpp_headers/assert.o 00:02:06.181 LINK dif 00:02:06.181 LINK nvme_fuzz 00:02:06.181 LINK blobcli 00:02:06.181 1 warning generated. 
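The "1 warning generated." above refers to the clang diagnostic emitted earlier for fio_plugin.c: a field of variable-sized type placed before the end of an enclosing struct. A minimal C sketch of the pattern being flagged follows; the struct tag and source location come from the log itself, but the field names and layout here are stand-ins assumed for illustration, not the actual SPDK definitions:

    #include <stdint.h>

    /* Stand-in for struct spdk_nvme_fdp_ruhs: a fixed header followed by a
     * flexible array member, so the struct's size is not known at compile
     * time. (Assumed shape; the real definition lives in the SPDK NVMe
     * headers, as the diagnostic's type name indicates.) */
    struct ruhs_like {
        uint16_t nruhsd;        /* count of descriptors that follow */
        uint16_t ruhs_desc[];   /* flexible array member */
    };

    /* Embedding such a type anywhere but as the last field of an enclosing
     * struct is only accepted as a GNU extension, which is exactly what
     * -Wgnu-variable-sized-type-not-at-end warns about above. */
    struct enclosing {
        struct ruhs_like ruhs;  /* warned: variable-sized, not last */
        uint32_t trailing_field;
    };

Compiling this sketch with clang (clang -c sketch.c) reproduces the same class of warning; the build above continues because the diagnostic is a warning, not an error.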
00:02:06.181 LINK spdk_bdev 00:02:06.446 LINK mem_callbacks 00:02:06.446 LINK spdk_nvme 00:02:06.446 CXX test/cpp_headers/barrier.o 00:02:06.446 CXX test/cpp_headers/base64.o 00:02:06.446 CXX test/cpp_headers/bdev.o 00:02:06.446 CXX test/cpp_headers/bdev_module.o 00:02:06.446 LINK spdk_nvme_identify 00:02:06.446 LINK spdk_nvme_perf 00:02:06.708 LINK spdk_top 00:02:06.708 LINK memory_ut 00:02:06.708 CXX test/cpp_headers/bdev_zone.o 00:02:06.708 LINK bdevperf 00:02:06.708 CC test/nvme/compliance/nvme_compliance.o 00:02:06.708 CC test/nvme/fused_ordering/fused_ordering.o 00:02:06.708 CXX test/cpp_headers/bit_array.o 00:02:06.708 CXX test/cpp_headers/bit_pool.o 00:02:06.708 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:06.972 CC test/nvme/fdp/fdp.o 00:02:06.972 CXX test/cpp_headers/blob_bdev.o 00:02:06.972 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:06.972 CXX test/cpp_headers/blobfs_bdev.o 00:02:06.972 CXX test/cpp_headers/blobfs.o 00:02:06.972 CXX test/cpp_headers/blob.o 00:02:06.972 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:06.972 CXX test/cpp_headers/conf.o 00:02:06.972 CC test/nvme/cuse/cuse.o 00:02:06.972 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:06.972 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:06.972 CXX test/cpp_headers/config.o 00:02:06.972 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:06.972 CXX test/cpp_headers/cpuset.o 00:02:06.972 CXX test/cpp_headers/crc16.o 00:02:06.972 CXX test/cpp_headers/crc32.o 00:02:06.972 LINK fused_ordering 00:02:06.972 CXX test/cpp_headers/crc64.o 00:02:06.972 CXX test/cpp_headers/dif.o 00:02:07.236 CXX test/cpp_headers/dma.o 00:02:07.236 LINK doorbell_aers 00:02:07.236 CXX test/cpp_headers/endian.o 00:02:07.236 CXX test/cpp_headers/env_dpdk.o 00:02:07.236 CXX test/cpp_headers/env.o 00:02:07.236 CXX test/cpp_headers/event.o 00:02:07.236 CXX test/cpp_headers/fd_group.o 00:02:07.236 CXX test/cpp_headers/fd.o 00:02:07.236 CXX test/cpp_headers/file.o 00:02:07.236 LINK fdp 00:02:07.236 CXX test/cpp_headers/ftl.o 00:02:07.236 CXX test/cpp_headers/gpt_spec.o 00:02:07.236 CXX test/cpp_headers/hexlify.o 00:02:07.236 CXX test/cpp_headers/histogram_data.o 00:02:07.236 CXX test/cpp_headers/idxd.o 00:02:07.236 CXX test/cpp_headers/idxd_spec.o 00:02:07.236 CXX test/cpp_headers/init.o 00:02:07.236 CXX test/cpp_headers/ioat.o 00:02:07.236 CXX test/cpp_headers/ioat_spec.o 00:02:07.236 CXX test/cpp_headers/iscsi_spec.o 00:02:07.236 CXX test/cpp_headers/json.o 00:02:07.236 CXX test/cpp_headers/jsonrpc.o 00:02:07.236 CXX test/cpp_headers/keyring.o 00:02:07.236 CXX test/cpp_headers/keyring_module.o 00:02:07.236 CXX test/cpp_headers/likely.o 00:02:07.503 CXX test/cpp_headers/log.o 00:02:07.504 CXX test/cpp_headers/lvol.o 00:02:07.504 CXX test/cpp_headers/memory.o 00:02:07.504 CXX test/cpp_headers/mmio.o 00:02:07.504 CXX test/cpp_headers/nbd.o 00:02:07.504 CXX test/cpp_headers/notify.o 00:02:07.504 CXX test/cpp_headers/nvme.o 00:02:07.504 CXX test/cpp_headers/nvme_intel.o 00:02:07.504 CXX test/cpp_headers/nvme_ocssd.o 00:02:07.504 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:07.504 LINK nvme_compliance 00:02:07.504 CXX test/cpp_headers/nvme_spec.o 00:02:07.504 CXX test/cpp_headers/nvme_zns.o 00:02:07.504 LINK llvm_vfio_fuzz 00:02:07.504 CXX test/cpp_headers/nvmf_cmd.o 00:02:07.504 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:07.504 CXX test/cpp_headers/nvmf.o 00:02:07.504 CXX test/cpp_headers/nvmf_spec.o 00:02:07.504 CXX test/cpp_headers/nvmf_transport.o 00:02:07.504 CXX test/cpp_headers/opal.o 00:02:07.504 CXX 
test/cpp_headers/opal_spec.o 00:02:07.504 CXX test/cpp_headers/pci_ids.o 00:02:07.504 CXX test/cpp_headers/pipe.o 00:02:07.504 CXX test/cpp_headers/queue.o 00:02:07.504 CXX test/cpp_headers/reduce.o 00:02:07.504 CXX test/cpp_headers/rpc.o 00:02:07.504 CXX test/cpp_headers/scheduler.o 00:02:07.504 CXX test/cpp_headers/scsi.o 00:02:07.504 CXX test/cpp_headers/scsi_spec.o 00:02:07.504 CXX test/cpp_headers/sock.o 00:02:07.504 CXX test/cpp_headers/stdinc.o 00:02:07.504 CXX test/cpp_headers/string.o 00:02:07.504 CXX test/cpp_headers/thread.o 00:02:07.504 CXX test/cpp_headers/trace.o 00:02:07.504 CXX test/cpp_headers/trace_parser.o 00:02:07.504 CXX test/cpp_headers/tree.o 00:02:07.504 CXX test/cpp_headers/ublk.o 00:02:07.504 CXX test/cpp_headers/util.o 00:02:07.504 CXX test/cpp_headers/uuid.o 00:02:07.504 CXX test/cpp_headers/version.o 00:02:07.504 CXX test/cpp_headers/vfio_user_pci.o 00:02:07.504 CXX test/cpp_headers/vfio_user_spec.o 00:02:07.504 CXX test/cpp_headers/vhost.o 00:02:07.504 CXX test/cpp_headers/vmd.o 00:02:07.504 CXX test/cpp_headers/xor.o 00:02:07.762 CXX test/cpp_headers/zipf.o 00:02:07.762 LINK vhost_fuzz 00:02:07.762 LINK llvm_nvme_fuzz 00:02:08.020 LINK spdk_lock 00:02:08.020 LINK cuse 00:02:08.586 LINK iscsi_fuzz 00:02:10.486 LINK esnap 00:02:10.745 00:02:10.745 real 0m45.537s 00:02:10.745 user 7m11.854s 00:02:10.745 sys 2m34.755s 00:02:10.745 11:38:01 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:02:10.745 11:38:01 -- common/autotest_common.sh@10 -- $ set +x 00:02:10.745 ************************************ 00:02:10.745 END TEST make 00:02:10.745 ************************************ 00:02:10.745 11:38:01 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:10.745 11:38:01 -- pm/common@30 -- $ signal_monitor_resources TERM 00:02:10.745 11:38:01 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:02:10.745 11:38:01 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:10.745 11:38:01 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:10.745 11:38:01 -- pm/common@45 -- $ pid=228888 00:02:10.745 11:38:01 -- pm/common@52 -- $ sudo kill -TERM 228888 00:02:11.004 11:38:01 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.004 11:38:01 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:11.004 11:38:01 -- pm/common@45 -- $ pid=228885 00:02:11.004 11:38:01 -- pm/common@52 -- $ sudo kill -TERM 228885 00:02:11.004 11:38:01 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.004 11:38:01 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:11.004 11:38:01 -- pm/common@45 -- $ pid=228895 00:02:11.004 11:38:01 -- pm/common@52 -- $ sudo kill -TERM 228895 00:02:11.004 11:38:01 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.004 11:38:01 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:11.004 11:38:01 -- pm/common@45 -- $ pid=228891 00:02:11.004 11:38:01 -- pm/common@52 -- $ sudo kill -TERM 228891 00:02:11.004 11:38:01 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:11.004 11:38:01 -- nvmf/common.sh@7 -- # uname -s 00:02:11.004 11:38:01 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:11.004 11:38:01 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 
00:02:11.004 11:38:01 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:11.004 11:38:01 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:11.004 11:38:01 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:11.004 11:38:01 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:11.004 11:38:01 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:11.004 11:38:01 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:11.004 11:38:01 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:11.004 11:38:01 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:11.004 11:38:01 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:02:11.004 11:38:01 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:02:11.004 11:38:01 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:11.004 11:38:01 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:11.004 11:38:01 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:11.004 11:38:01 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:11.004 11:38:01 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:11.004 11:38:01 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:11.004 11:38:01 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:11.004 11:38:01 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:11.004 11:38:01 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.005 11:38:01 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.005 11:38:01 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.263 11:38:01 -- paths/export.sh@5 -- # export PATH 00:02:11.263 11:38:01 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.263 11:38:01 -- nvmf/common.sh@47 -- # : 0 00:02:11.263 11:38:01 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:11.263 11:38:01 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:11.263 11:38:01 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:11.263 11:38:01 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:11.263 11:38:01 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:11.263 11:38:01 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:11.263 11:38:01 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:11.263 11:38:01 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:11.263 11:38:01 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 
00:02:11.263 11:38:01 -- spdk/autotest.sh@32 -- # uname -s 00:02:11.263 11:38:01 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:11.263 11:38:01 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:11.263 11:38:01 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:11.263 11:38:01 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:11.263 11:38:01 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:11.263 11:38:01 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:11.263 11:38:01 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:11.263 11:38:01 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:11.263 11:38:01 -- spdk/autotest.sh@48 -- # udevadm_pid=286175 00:02:11.263 11:38:01 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:11.263 11:38:01 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:11.263 11:38:01 -- pm/common@17 -- # local monitor 00:02:11.263 11:38:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.263 11:38:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=286177 00:02:11.263 11:38:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.263 11:38:01 -- pm/common@21 -- # date +%s 00:02:11.263 11:38:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=286179 00:02:11.263 11:38:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.263 11:38:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=286183 00:02:11.263 11:38:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.263 11:38:01 -- pm/common@21 -- # date +%s 00:02:11.263 11:38:01 -- pm/common@21 -- # date +%s 00:02:11.263 11:38:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=286186 00:02:11.263 11:38:01 -- pm/common@26 -- # sleep 1 00:02:11.263 11:38:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081 00:02:11.263 11:38:01 -- pm/common@21 -- # date +%s 00:02:11.263 11:38:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081 00:02:11.263 11:38:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081 00:02:11.263 11:38:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081 00:02:11.263 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713433081_collect-cpu-load.pm.log 00:02:11.263 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713433081_collect-cpu-temp.pm.log 00:02:11.263 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713433081_collect-vmstat.pm.log 00:02:11.263 Redirecting to 
00:02:12.199 11:38:02 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT
00:02:12.199 11:38:02 -- spdk/autotest.sh@57 -- # timing_enter autotest
00:02:12.199 11:38:02 -- common/autotest_common.sh@710 -- # xtrace_disable
00:02:12.199 11:38:02 -- common/autotest_common.sh@10 -- # set +x
00:02:12.199 11:38:02 -- spdk/autotest.sh@59 -- # create_test_list
00:02:12.199 11:38:02 -- common/autotest_common.sh@734 -- # xtrace_disable
00:02:12.199 11:38:02 -- common/autotest_common.sh@10 -- # set +x
00:02:12.199 11:38:02 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh
00:02:12.199 11:38:02 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:12.199 11:38:02 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:12.199 11:38:02 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:02:12.199 11:38:02 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:12.199 11:38:02 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod
00:02:12.199 11:38:02 -- common/autotest_common.sh@1441 -- # uname
00:02:12.199 11:38:02 -- common/autotest_common.sh@1441 -- # '[' Linux = FreeBSD ']'
00:02:12.199 11:38:02 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf
00:02:12.199 11:38:02 -- common/autotest_common.sh@1461 -- # uname
00:02:12.199 11:38:02 -- common/autotest_common.sh@1461 -- # [[ Linux = FreeBSD ]]
00:02:12.199 11:38:02 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk
00:02:12.199 11:38:02 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=clang
00:02:12.199 11:38:02 -- spdk/autotest.sh@72 -- # hash lcov
00:02:12.199 11:38:02 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:02:12.199 11:38:02 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup
00:02:12.199 11:38:02 -- common/autotest_common.sh@710 -- # xtrace_disable
00:02:12.199 11:38:02 -- common/autotest_common.sh@10 -- # set +x
00:02:12.199 11:38:02 -- spdk/autotest.sh@91 -- # rm -f
00:02:12.199 11:38:02 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:02:16.389 0000:1a:00.0 (8086 0a54): Already using the nvme driver
00:02:16.389 0000:00:04.7 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.6 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.5 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.4 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.3 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.2 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.1 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:00:04.0 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.7 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.6 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.5 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.4 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.3 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.2 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.1 (8086 2021): Already using the ioatdma driver
00:02:16.389 0000:80:04.0 (8086 2021): Already using the ioatdma driver
00:02:18.295 11:38:08 -- spdk/autotest.sh@96 -- # get_zoned_devs
00:02:18.295 11:38:08 -- common/autotest_common.sh@1655 -- # zoned_devs=()
00:02:18.295 11:38:08 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs
00:02:18.295 11:38:08 -- common/autotest_common.sh@1656 -- # local nvme bdf
00:02:18.295 11:38:08 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme*
00:02:18.295 11:38:08 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1
00:02:18.295 11:38:08 -- common/autotest_common.sh@1648 -- # local device=nvme0n1
00:02:18.295 11:38:08 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:02:18.295 11:38:08 -- common/autotest_common.sh@1651 -- # [[ none != none ]]
00:02:18.295 11:38:08 -- spdk/autotest.sh@98 -- # (( 0 > 0 ))
00:02:18.295 11:38:08 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*)
00:02:18.295 11:38:08 -- spdk/autotest.sh@112 -- # [[ -z '' ]]
00:02:18.295 11:38:08 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1
00:02:18.295 11:38:08 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt
00:02:18.295 11:38:08 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1
00:02:18.295 No valid GPT data, bailing
00:02:18.295 11:38:08 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:02:18.295 11:38:08 -- scripts/common.sh@391 -- # pt=
00:02:18.295 11:38:08 -- scripts/common.sh@392 -- # return 1
00:02:18.295 11:38:08 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
00:02:18.295 1+0 records in
00:02:18.295 1+0 records out
00:02:18.296 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00707275 s, 148 MB/s
00:02:18.296 11:38:08 -- spdk/autotest.sh@118 -- # sync
00:02:18.296 11:38:08 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes
00:02:18.296 11:38:08 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null'
00:02:18.296 11:38:08 -- common/autotest_common.sh@22 -- # reap_spdk_processes
00:02:23.568 11:38:13 -- spdk/autotest.sh@124 -- # uname -s
00:02:23.568 11:38:13 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']'
00:02:23.568 11:38:13 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh
00:02:23.568 11:38:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:02:23.568 11:38:13 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:02:23.568 11:38:13 -- common/autotest_common.sh@10 -- # set +x
00:02:23.568 ************************************
00:02:23.568 START TEST setup.sh
00:02:23.568 ************************************
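A note on the wipe sequence a few entries up: block_in_use probes /dev/nvme0n1 for a partition table, and only when none is found ("No valid GPT data, bailing" plus an empty PTTYPE) does autotest zero the first MiB. The same check-and-wipe pattern as a stand-alone sketch, not the verbatim autotest code:

    # Zero the first MiB of an unclaimed namespace so stale metadata
    # doesn't confuse later tests; skip disks that carry a partition table.
    pt=$(blkid -s PTTYPE -o value /dev/nvme0n1)
    if [[ -z $pt ]]; then
        dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
        sync
    fi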
00:02:23.568 11:38:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh
00:02:23.568 * Looking for test storage...
00:02:23.568 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:02:23.568 11:38:13 -- setup/test-setup.sh@10 -- # uname -s
00:02:23.568 11:38:13 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:02:23.568 11:38:13 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh
00:02:23.568 11:38:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:02:23.568 11:38:13 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:02:23.568 11:38:13 -- common/autotest_common.sh@10 -- # set +x
00:02:23.828 ************************************
00:02:23.828 START TEST acl
00:02:23.828 ************************************
00:02:23.828 11:38:14 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh
00:02:23.828 * Looking for test storage...
00:02:23.828 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:02:23.828 11:38:14 -- setup/acl.sh@10 -- # get_zoned_devs
00:02:23.828 11:38:14 -- common/autotest_common.sh@1655 -- # zoned_devs=()
00:02:23.828 11:38:14 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs
00:02:23.828 11:38:14 -- common/autotest_common.sh@1656 -- # local nvme bdf
00:02:23.828 11:38:14 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme*
00:02:23.828 11:38:14 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1
00:02:23.828 11:38:14 -- common/autotest_common.sh@1648 -- # local device=nvme0n1
00:02:23.828 11:38:14 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:02:23.828 11:38:14 -- common/autotest_common.sh@1651 -- # [[ none != none ]]
00:02:23.828 11:38:14 -- setup/acl.sh@12 -- # devs=()
00:02:23.828 11:38:14 -- setup/acl.sh@12 -- # declare -a devs
00:02:23.828 11:38:14 -- setup/acl.sh@13 -- # drivers=()
00:02:23.828 11:38:14 -- setup/acl.sh@13 -- # declare -A drivers
00:02:23.828 11:38:14 -- setup/acl.sh@51 -- # setup reset
00:02:23.828 11:38:14 -- setup/common.sh@9 -- # [[ reset == output ]]
00:02:23.828 11:38:14 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:02:30.399 11:38:20 -- setup/acl.sh@52 -- # collect_setup_devs
00:02:30.399 11:38:20 -- setup/acl.sh@16 -- # local dev driver
00:02:30.399 11:38:20 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:02:30.399 11:38:20 -- setup/acl.sh@15 -- # setup output status
00:02:30.399 11:38:20 -- setup/common.sh@9 -- # [[ output == output ]]
00:02:30.399 11:38:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:02:33.771 Hugepages
00:02:33.771 node hugesize free / total
00:02:33.772 [xtrace condensed: collect_setup_devs reads the status table line by line; the hugepage rows and the column header below fail the *:*:*.* BDF match and are skipped]
00:02:33.772
00:02:33.772 Type BDF Vendor Device NUMA Driver Device Block devices
00:02:33.772 [xtrace condensed: each ioatdma channel at 0000:00:04.0-7 matches the BDF pattern but fails the 'ioatdma == nvme' check and is skipped; the NVMe controller is the only device collected]
00:02:33.772 11:38:23 -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]]
00:02:33.772 11:38:23 -- setup/acl.sh@20 -- # [[ nvme == nvme ]]
00:02:33.772 11:38:23 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]]
00:02:33.772 11:38:23 -- setup/acl.sh@22 -- # devs+=("$dev")
00:02:33.772 11:38:23 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme
00:02:33.772 11:38:23 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:02:33.772 [xtrace condensed: the ioatdma channels at 0000:80:04.0-7 are skipped the same way]
00:02:33.772 11:38:23 -- setup/acl.sh@24 -- # (( 1 > 0 ))
00:02:33.772 11:38:23 -- setup/acl.sh@54 -- # run_test denied denied
00:02:33.772 11:38:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:02:33.772 11:38:23 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:02:33.772 11:38:23 -- common/autotest_common.sh@10 -- # set +x
00:02:33.772 ************************************
00:02:33.772 START TEST denied
00:02:33.772 ************************************
00:02:33.772 11:38:23 -- common/autotest_common.sh@1111 -- # denied
00:02:33.772 11:38:23 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0'
00:02:33.772 11:38:23 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0'
00:02:33.772 11:38:23 -- setup/acl.sh@38 -- # setup output config
00:02:33.772 11:38:23 -- setup/common.sh@9 -- # [[ output == output ]]
00:02:33.772 11:38:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:02:40.343 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0
00:02:40.344 11:38:29 -- setup/acl.sh@40 -- # verify 0000:1a:00.0
00:02:40.344 11:38:29 -- setup/acl.sh@28 -- # local dev driver
00:02:40.344 11:38:29 -- setup/acl.sh@30 -- # for dev in "$@"
00:02:40.344 11:38:29 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]]
00:02:40.344 11:38:29 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver
00:02:40.344 11:38:29 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:02:40.344 11:38:29 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:02:40.344 11:38:29 -- setup/acl.sh@41 -- # setup reset
00:02:40.344 11:38:29 -- setup/common.sh@9 -- # [[ reset == output ]]
00:02:40.344 11:38:29 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:02:46.911
00:02:46.911 real 0m12.427s
00:02:46.911 user 0m3.987s
00:02:46.911 sys 0m7.725s
00:02:46.911 11:38:36 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:02:46.911 11:38:36 -- common/autotest_common.sh@10 -- # set +x
00:02:46.911 ************************************
00:02:46.911 END TEST denied
00:02:46.911 ************************************
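The denied test above exercises setup.sh's block-list path: with a controller's BDF in PCI_BLOCKED, setup.sh must leave it on its kernel driver and print the skip message the test greps for. Reduced to its essence (a sketch using the BDF from this run, run from the spdk checkout):

    # setup.sh honors PCI_BLOCKED / PCI_ALLOWED; the test passes when the
    # expected skip message appears for the blocked controller.
    PCI_BLOCKED=' 0000:1a:00.0' ./scripts/setup.sh config \
        | grep 'Skipping denied controller at 0000:1a:00.0'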
00:02:46.911 11:38:36 -- setup/acl.sh@55 -- # run_test allowed allowed
00:02:46.911 11:38:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:02:46.911 11:38:36 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:02:46.911 11:38:36 -- common/autotest_common.sh@10 -- # set +x
00:02:46.911 ************************************
00:02:46.911 START TEST allowed
00:02:46.911 ************************************
00:02:46.911 11:38:36 -- common/autotest_common.sh@1111 -- # allowed
00:02:46.911 11:38:36 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0
00:02:46.911 11:38:36 -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*'
00:02:46.911 11:38:36 -- setup/acl.sh@45 -- # setup output config
00:02:46.911 11:38:36 -- setup/common.sh@9 -- # [[ output == output ]]
00:02:46.911 11:38:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:02:55.029 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci
00:02:55.029 11:38:45 -- setup/acl.sh@47 -- # verify
00:02:55.029 11:38:45 -- setup/acl.sh@28 -- # local dev driver
00:02:55.029 11:38:45 -- setup/acl.sh@48 -- # setup reset
00:02:55.029 11:38:45 -- setup/common.sh@9 -- # [[ reset == output ]]
00:02:55.029 11:38:45 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:01.606
00:03:01.606 real 0m14.813s
00:03:01.606 user 0m4.087s
00:03:01.606 sys 0m7.573s
00:03:01.606 11:38:51 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:01.606 11:38:51 -- common/autotest_common.sh@10 -- # set +x
00:03:01.606 ************************************
00:03:01.606 END TEST allowed
00:03:01.606 ************************************
00:03:01.607
00:03:01.607 real 0m37.288s
00:03:01.607 user 0m11.449s
00:03:01.607 sys 0m22.114s
00:03:01.607 11:38:51 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:01.607 11:38:51 -- common/autotest_common.sh@10 -- # set +x
00:03:01.607 ************************************
00:03:01.607 END TEST acl
00:03:01.607 ************************************
00:03:01.607 11:38:51 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:01.607 11:38:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:01.607 11:38:51 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:01.607 11:38:51 -- common/autotest_common.sh@10 -- # set +x
00:03:01.607 ************************************
00:03:01.607 START TEST hugepages
00:03:01.607 ************************************
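The START/END banners and real/user/sys triplets throughout this log come from run_test in autotest_common.sh, which is essentially a named, timed wrapper around each test function or script. A simplified sketch of that pattern, not the exact SPDK implementation:

    # run_test NAME CMD ARGS...: banner, time the command, banner again.
    run_test() {
        local name=$1; shift
        echo "************ START TEST $name ************"
        time "$@"
        echo "************ END TEST $name ************"
    }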
00:03:01.607 11:38:51 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:01.607 * Looking for test storage...
00:03:01.607 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:01.607 11:38:51 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:03:01.607 11:38:51 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:03:01.607 11:38:51 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:03:01.607 11:38:51 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:03:01.607 11:38:51 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:03:01.607 11:38:51 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:03:01.607 11:38:51 -- setup/common.sh@17 -- # local get=Hugepagesize
00:03:01.607 11:38:51 -- setup/common.sh@18 -- # local node=
00:03:01.607 11:38:51 -- setup/common.sh@19 -- # local var val
00:03:01.607 11:38:51 -- setup/common.sh@20 -- # local mem_f mem
00:03:01.607 11:38:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:01.607 11:38:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:01.607 11:38:51 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:01.607 11:38:51 -- setup/common.sh@28 -- # mapfile -t mem
00:03:01.607 11:38:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:01.607 11:38:51 -- setup/common.sh@31 -- # IFS=': '
00:03:01.607 11:38:51 -- setup/common.sh@31 -- # read -r var val _
00:03:01.607 11:38:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 72515236 kB' 'MemAvailable: 76238628 kB' 'Buffers: 9664 kB' 'Cached: 12374408 kB' 'SwapCached: 0 kB' 'Active: 9387244 kB' 'Inactive: 3552388 kB' 'Active(anon): 8664336 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 559388 kB' 'Mapped: 198352 kB' 'Shmem: 8108776 kB' 'KReclaimable: 203436 kB' 'Slab: 582556 kB' 'SReclaimable: 203436 kB' 'SUnreclaim: 379120 kB' 'KernelStack: 16272 kB' 'PageTables: 8672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438220 kB' 'Committed_AS: 10011480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210676 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:01.607 [xtrace condensed: the read loop then tests every /proc/meminfo field name above against Hugepagesize, issuing 'continue' for each one until the match]
00:03:01.608 11:38:51 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:01.608 11:38:51 -- setup/common.sh@33 -- # echo 2048
00:03:01.608 11:38:51 -- setup/common.sh@33 -- # return 0
00:03:01.608 11:38:51 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:01.608 11:38:51 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:01.608 11:38:51 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:01.608 11:38:51 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:01.608 11:38:51 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:01.608 11:38:51 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
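The get_meminfo helper traced above is worth seeing in one piece; stripped of xtrace noise it boils down to the following (a reconstruction from the trace, not a verbatim copy of setup/common.sh, and the per-node meminfo branch is omitted):

    # get_meminfo FIELD: print FIELD's value from /proc/meminfo.
    get_meminfo() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"   # "Hugepagesize: 2048 kB" -> 2048 (the "kB" lands in $_)
                return 0
            fi
        done < /proc/meminfo
        return 1
    }
    get_meminfo Hugepagesize   # prints 2048 on this machine, matching the trace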
00:03:01.608 11:38:51 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:01.608 11:38:51 -- setup/hugepages.sh@207 -- # get_nodes 00:03:01.608 11:38:51 -- setup/hugepages.sh@27 -- # local node 00:03:01.608 11:38:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:01.608 11:38:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:01.608 11:38:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:01.608 11:38:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:01.608 11:38:51 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:01.608 11:38:51 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:01.608 11:38:51 -- setup/hugepages.sh@208 -- # clear_hp 00:03:01.608 11:38:51 -- setup/hugepages.sh@37 -- # local node hp 00:03:01.608 11:38:51 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:01.608 11:38:51 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:01.608 11:38:51 -- setup/hugepages.sh@41 -- # echo 0 00:03:01.608 11:38:51 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:01.608 11:38:51 -- setup/hugepages.sh@41 -- # echo 0 00:03:01.608 11:38:51 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:01.608 11:38:51 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:01.608 11:38:51 -- setup/hugepages.sh@41 -- # echo 0 00:03:01.608 11:38:51 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:01.608 11:38:51 -- setup/hugepages.sh@41 -- # echo 0 00:03:01.608 11:38:51 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:01.608 11:38:51 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:01.608 11:38:51 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:01.608 11:38:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:01.608 11:38:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:01.608 11:38:51 -- common/autotest_common.sh@10 -- # set +x 00:03:01.608 ************************************ 00:03:01.609 START TEST default_setup 00:03:01.609 ************************************ 00:03:01.609 11:38:51 -- common/autotest_common.sh@1111 -- # default_setup 00:03:01.609 11:38:51 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:01.609 11:38:51 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:01.609 11:38:51 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:01.609 11:38:51 -- setup/hugepages.sh@51 -- # shift 00:03:01.609 11:38:51 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:01.609 11:38:51 -- setup/hugepages.sh@52 -- # local node_ids 00:03:01.609 11:38:51 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:01.609 11:38:51 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:01.609 11:38:51 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:01.609 11:38:51 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:01.609 11:38:51 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:01.609 11:38:51 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:01.609 11:38:51 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:01.609 11:38:51 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:01.609 11:38:51 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:01.609 11:38:52 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:01.609 11:38:52 -- 
setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:01.609 11:38:52 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:01.609 11:38:52 -- setup/hugepages.sh@73 -- # return 0 00:03:01.609 11:38:52 -- setup/hugepages.sh@137 -- # setup output 00:03:01.609 11:38:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:01.609 11:38:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:05.812 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:05.812 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:09.104 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:03:11.012 11:39:01 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:11.012 11:39:01 -- setup/hugepages.sh@89 -- # local node 00:03:11.012 11:39:01 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:11.012 11:39:01 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:11.012 11:39:01 -- setup/hugepages.sh@92 -- # local surp 00:03:11.012 11:39:01 -- setup/hugepages.sh@93 -- # local resv 00:03:11.012 11:39:01 -- setup/hugepages.sh@94 -- # local anon 00:03:11.012 11:39:01 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:11.012 11:39:01 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:11.012 11:39:01 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:11.012 11:39:01 -- setup/common.sh@18 -- # local node= 00:03:11.012 11:39:01 -- setup/common.sh@19 -- # local var val 00:03:11.012 11:39:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:11.012 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:11.012 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:11.012 11:39:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:11.012 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:11.012 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.012 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74728804 kB' 'MemAvailable: 78451988 kB' 'Buffers: 9664 kB' 'Cached: 12374568 kB' 'SwapCached: 0 kB' 'Active: 9405000 kB' 'Inactive: 3552388 kB' 'Active(anon): 8682092 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576160 kB' 'Mapped: 199080 kB' 'Shmem: 8108936 kB' 'KReclaimable: 203020 kB' 'Slab: 580236 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377216 kB' 'KernelStack: 16208 kB' 'PageTables: 9156 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10030840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210628 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:11.012 11:39:01 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.012 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.012 11:39:01 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.012 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.012 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 
-- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ KReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.013 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.013 11:39:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:11.013 11:39:01 -- setup/common.sh@33 -- # echo 0 00:03:11.013 11:39:01 -- setup/common.sh@33 -- # return 0 00:03:11.013 11:39:01 -- setup/hugepages.sh@97 -- # anon=0 00:03:11.013 11:39:01 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:11.013 11:39:01 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:11.013 11:39:01 -- setup/common.sh@18 -- # local node= 00:03:11.013 11:39:01 -- setup/common.sh@19 -- # local var val 00:03:11.013 11:39:01 -- setup/common.sh@20 -- # local mem_f mem 00:03:11.013 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:11.013 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:11.013 11:39:01 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:11.013 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem 00:03:11.013 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:11.014 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.014 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.014 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74728864 kB' 'MemAvailable: 78452048 kB' 'Buffers: 9664 kB' 'Cached: 12374572 kB' 'SwapCached: 0 kB' 'Active: 9400024 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677116 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 571544 kB' 'Mapped: 198512 kB' 'Shmem: 8108940 kB' 'KReclaimable: 203020 kB' 'Slab: 580228 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377208 kB' 'KernelStack: 16208 kB' 'PageTables: 9128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10026616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210596 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 
00:03:11.013 11:39:01 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:11.013 11:39:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:11.013 11:39:01 -- setup/common.sh@18 -- # local node=
00:03:11.013 11:39:01 -- setup/common.sh@19 -- # local var val
00:03:11.013 11:39:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.013 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.013 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.013 11:39:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.013 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.013 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.014 11:39:01 -- setup/common.sh@31 -- # IFS=': '
00:03:11.014 11:39:01 -- setup/common.sh@31 -- # read -r var val _
00:03:11.014 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74728864 kB' 'MemAvailable: 78452048 kB' 'Buffers: 9664 kB' 'Cached: 12374572 kB' 'SwapCached: 0 kB' 'Active: 9400024 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677116 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 571544 kB' 'Mapped: 198512 kB' 'Shmem: 8108940 kB' 'KReclaimable: 203020 kB' 'Slab: 580228 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377208 kB' 'KernelStack: 16208 kB' 'PageTables: 9128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10026616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210596 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:11.014 11:39:01 -- setup/common.sh@31-32 -- # [field scan: [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] || continue; repeated for every /proc/meminfo field, MemTotal through HugePages_Rsvd]
00:03:11.015 11:39:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:11.015 11:39:01 -- setup/common.sh@33 -- # echo 0
00:03:11.015 11:39:01 -- setup/common.sh@33 -- # return 0
00:03:11.015 11:39:01 -- setup/hugepages.sh@99 -- # surp=0
00:03:11.015 11:39:01 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:11.015 11:39:01 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:11.015 11:39:01 -- setup/common.sh@18 -- # local node=
00:03:11.015 11:39:01 -- setup/common.sh@19 -- # local var val
00:03:11.015 11:39:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.015 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.015 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.015 11:39:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.015 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.015 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.015 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74727608 kB' 'MemAvailable: 78450792 kB' 'Buffers: 9664 kB' 'Cached: 12374584 kB' 'SwapCached: 0 kB' 'Active: 9404360 kB' 'Inactive: 3552388 kB' 'Active(anon): 8681452 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575868 kB' 'Mapped: 198916 kB' 'Shmem: 8108952 kB' 'KReclaimable: 203020 kB' 'Slab: 580228 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377208 kB' 'KernelStack: 16192 kB' 'PageTables: 9096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10030868 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210600 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:11.015 11:39:01 -- setup/common.sh@31-32 -- # [field scan: [[ $var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] || continue; repeated for every /proc/meminfo field, MemTotal through HugePages_Free]
00:03:11.016 11:39:01 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:11.016 11:39:01 -- setup/common.sh@33 -- # echo 0
00:03:11.016 11:39:01 -- setup/common.sh@33 -- # return 0
00:03:11.016 11:39:01 -- setup/hugepages.sh@100 -- # resv=0
00:03:11.016 11:39:01 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:11.016 nr_hugepages=1024
00:03:11.016 11:39:01 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:11.016 resv_hugepages=0
00:03:11.016 11:39:01 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:11.016 surplus_hugepages=0
00:03:11.016 11:39:01 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:11.016 anon_hugepages=0
00:03:11.016 11:39:01 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:11.016 11:39:01 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
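The two checks above are the heart of the verification: the kernel-reported HugePages_Total must equal the requested count plus surplus and reserved pages. The same accounting identity as a standalone sketch, reusing the hypothetical my_get_meminfo helper from earlier (values in comments are from this run):

    nr_hugepages=1024                          # requested count (NRHUGE)
    surp=$(my_get_meminfo HugePages_Surp)      # 0 in this run
    resv=$(my_get_meminfo HugePages_Rsvd)      # 0 in this run
    total=$(my_get_meminfo HugePages_Total)    # 1024 in this run
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2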
00:03:11.016 11:39:01 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:11.016 11:39:01 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:11.016 11:39:01 -- setup/common.sh@18 -- # local node=
00:03:11.016 11:39:01 -- setup/common.sh@19 -- # local var val
00:03:11.016 11:39:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.016 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.016 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:11.016 11:39:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:11.016 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.017 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.017 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74731664 kB' 'MemAvailable: 78454848 kB' 'Buffers: 9664 kB' 'Cached: 12374596 kB' 'SwapCached: 0 kB' 'Active: 9400088 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677180 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570988 kB' 'Mapped: 198504 kB' 'Shmem: 8108964 kB' 'KReclaimable: 203020 kB' 'Slab: 580220 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377200 kB' 'KernelStack: 16192 kB' 'PageTables: 9068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10026384 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210612 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:11.017 11:39:01 -- setup/common.sh@31-32 -- # [field scan: [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] || continue; repeated for every /proc/meminfo field, MemTotal through Unaccepted]
00:03:11.018 11:39:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:11.018 11:39:01 -- setup/common.sh@33 -- # echo 1024
00:03:11.018 11:39:01 -- setup/common.sh@33 -- # return 0
00:03:11.018 11:39:01 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:11.018 11:39:01 -- setup/hugepages.sh@112 -- # get_nodes
00:03:11.018 11:39:01 -- setup/hugepages.sh@27 -- # local node
00:03:11.018 11:39:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:11.018 11:39:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:11.018 11:39:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:11.018 11:39:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:11.018 11:39:01 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:11.018 11:39:01 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:11.018 11:39:01 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:11.018 11:39:01 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:11.018 11:39:01 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:11.018 11:39:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:11.018 11:39:01 -- setup/common.sh@18 -- # local node=0
00:03:11.018 11:39:01 -- setup/common.sh@19 -- # local var val
00:03:11.018 11:39:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:11.018 11:39:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:11.018 11:39:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:11.018 11:39:01 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:11.018 11:39:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:11.018 11:39:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:11.018 11:39:01 -- setup/common.sh@31 -- # IFS=': '
00:03:11.018 11:39:01 -- setup/common.sh@31 -- # read -r var val _
00:03:11.018 11:39:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40846780 kB' 'MemUsed: 7270184 kB' 'SwapCached: 0 kB' 'Active: 3188156 kB' 'Inactive: 140552 kB' 'Active(anon): 2928636 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3091816 kB' 'Mapped: 132976 kB' 'AnonPages: 240032 kB' 'Shmem: 2691744 kB' 'KernelStack: 8136 kB' 'PageTables: 4480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 351520 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 237904 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:11.018 11:39:01 -- setup/common.sh@31-32 -- # [field scan: [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] || continue; repeated for every node0 meminfo field, MemTotal through HugePages_Free]
00:03:11.019 11:39:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:11.019 11:39:01 -- setup/common.sh@33 -- # echo 0
00:03:11.019 11:39:01 -- setup/common.sh@33 -- # return 0
00:03:11.019 11:39:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
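get_nodes, traced above, discovers the NUMA layout by globbing /sys/devices/system/node/node+([0-9]) (extglob) and records each node's hugepage count; on this box that yields nodes_sys[0]=1024 and nodes_sys[1]=0. A rough equivalent without extglob, again leaning on the hypothetical my_get_meminfo (node_pages is an illustrative name):

    declare -A node_pages
    for d in /sys/devices/system/node/node[0-9]*; do
        n=${d##*node}                                     # node index, e.g. 0
        node_pages[$n]=$(my_get_meminfo HugePages_Total "$n")
    done
    # This run: node_pages[0]=1024, node_pages[1]=0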
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.019 11:39:01 -- setup/common.sh@32 -- # continue 00:03:11.019 11:39:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:11.019 11:39:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:11.019 11:39:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:11.019 11:39:01 -- setup/common.sh@33 -- # echo 0 00:03:11.019 11:39:01 -- setup/common.sh@33 -- # return 0 00:03:11.019 11:39:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:11.019 11:39:01 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:11.019 11:39:01 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:11.019 11:39:01 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:11.019 11:39:01 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:11.019 node0=1024 expecting 1024 00:03:11.019 11:39:01 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:11.019 00:03:11.019 real 0m9.259s 00:03:11.019 user 0m2.236s 00:03:11.019 sys 0m3.979s 00:03:11.019 11:39:01 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:11.019 11:39:01 -- common/autotest_common.sh@10 -- # set +x 00:03:11.019 ************************************ 00:03:11.019 END TEST default_setup 00:03:11.019 ************************************ 00:03:11.019 11:39:01 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:11.019 11:39:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:11.019 11:39:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:11.019 11:39:01 -- common/autotest_common.sh@10 -- # set +x 00:03:11.019 ************************************ 00:03:11.019 START TEST per_node_1G_alloc 00:03:11.019 ************************************ 00:03:11.019 11:39:01 -- common/autotest_common.sh@1111 -- # per_node_1G_alloc 00:03:11.019 11:39:01 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:11.019 11:39:01 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:11.019 11:39:01 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:11.019 11:39:01 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:11.019 11:39:01 -- setup/hugepages.sh@51 -- # shift 00:03:11.019 11:39:01 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:11.019 11:39:01 -- setup/hugepages.sh@52 -- # local node_ids 00:03:11.019 11:39:01 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:11.019 11:39:01 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:11.019 11:39:01 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:11.019 11:39:01 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:11.019 11:39:01 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:11.019 11:39:01 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:11.019 11:39:01 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:11.019 11:39:01 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:11.019 11:39:01 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:11.019 11:39:01 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:11.019 11:39:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:11.019 11:39:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:11.019 11:39:01 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:11.019 11:39:01 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:11.019 11:39:01 -- setup/hugepages.sh@73 -- # return 0 00:03:11.019 11:39:01 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:11.019 
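The arithmetic behind the trace just above: get_test_nr_hugepages is asked for 1048576 kB (1 GiB) on nodes 0 and 1; at the default 2048 kB hugepage size that is 512 pages, and because explicit nodes are listed, each listed node receives the full 512-page count (1 GiB per node, 1024 pages overall). A minimal runnable sketch of that logic, reconstructed from the trace rather than copied from setup/hugepages.sh; default_hugepages here is an assumption matching the Hugepagesize the snapshots below report:

    #!/usr/bin/env bash
    # Sketch reconstructed from the xtrace above; not the verbatim SPDK script.
    nr_hugepages=0
    nodes_test=()   # indexed array: pages requested per NUMA node

    get_test_nr_hugepages() {
        local size=$1                       # requested size in kB (1048576 = 1 GiB)
        shift
        local node_ids=("$@")               # explicit NUMA nodes, e.g. 0 1
        local default_hugepages=2048        # kB; assumed from Hugepagesize below
        (( size >= default_hugepages )) || return 1
        nr_hugepages=$(( size / default_hugepages ))   # 1048576 / 2048 = 512 pages
        local node
        for node in "${node_ids[@]}"; do
            nodes_test[node]=$nr_hugepages  # every listed node gets the full count
        done
    }

    get_test_nr_hugepages 1048576 0 1       # the call seen in the trace
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=512 node1=512

Note that the trace reuses _no_nodes (the node count) as the loop variable over user_nodes; the sketch uses a separate name for clarity, but the assignments it performs are the same two nodes_test[...]=512 steps logged above.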
00:03:11.019 11:39:01 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:11.019 11:39:01 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:11.019 11:39:01 -- setup/hugepages.sh@146 -- # setup output
00:03:11.019 11:39:01 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:11.019 11:39:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:14.309 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:14.309 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:14.309 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:14.310 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:14.310 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:14.310 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:16.220 11:39:06 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:16.220 11:39:06 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:16.220 11:39:06 -- setup/hugepages.sh@89 -- # local node
00:03:16.220 11:39:06 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:16.220 11:39:06 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:16.220 11:39:06 -- setup/hugepages.sh@92 -- # local surp
00:03:16.220 11:39:06 -- setup/hugepages.sh@93 -- # local resv
00:03:16.220 11:39:06 -- setup/hugepages.sh@94 -- # local anon
00:03:16.220 11:39:06 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:16.220 11:39:06 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:16.220 11:39:06 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:16.220 11:39:06 -- setup/common.sh@18 -- # local node=
00:03:16.220 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.220 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.220 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.220 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.220 11:39:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.220 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.220 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.220 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74736400 kB' 'MemAvailable: 78459584 kB' 'Buffers: 9664 kB' 'Cached: 12374708 kB' 'SwapCached: 0 kB' 'Active: 9405352 kB' 'Inactive: 3552388 kB' 'Active(anon): 8682444 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576068 kB' 'Mapped: 198168 kB' 'Shmem: 8109076 kB' 'KReclaimable: 203020 kB' 'Slab: 580572 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377552 kB' 'KernelStack: 16128 kB' 'PageTables: 8848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10020412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210648 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:16.221 11:39:06 -- setup/common.sh@32 -- # [xtrace collapsed: the @31/@32 skip loop tests every key from MemTotal through HardwareCorrupted against AnonHugePages; none match]
00:03:16.221 11:39:06 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:16.221 11:39:06 -- setup/common.sh@33 -- # echo 0
00:03:16.221 11:39:06 -- setup/common.sh@33 -- # return 0
00:03:16.221 11:39:06 -- setup/hugepages.sh@97 -- # anon=0
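Each get_meminfo call in this transcript expands into one of the long key-scan loops collapsed above: the helper snapshots /proc/meminfo (or the per-node counters under /sys/devices/system/node/node<N>/meminfo when a node argument is given, stripping the "Node N" prefix), then reads it with IFS=': ' until the requested key matches and echoes its value. A self-contained sketch of that parsing pattern, reconstructed from the trace and not verbatim setup/common.sh:

    #!/usr/bin/env bash
    # Minimal sketch of the traced get_meminfo parsing pattern; reconstructed
    # from the xtrace output, not copied from setup/common.sh.
    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # With a node argument, read that node's counters instead.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local var val _
        # Split "Key:   value kB" into key and value; stop at the first match.
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"     # kB for byte counters, pages for HugePages_*
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    get_meminfo HugePages_Total   # e.g. prints 1024 on the machine under test

The traced original additionally buffers the file with mapfile and strips per-node prefixes before scanning, which is why every skipped key shows up as its own @32 continue entry in the raw log.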
00:03:16.221 11:39:06 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:16.221 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.221 11:39:06 -- setup/common.sh@18 -- # local node=
00:03:16.221 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.221 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.221 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.221 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.221 11:39:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.221 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.222 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.222 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74737224 kB' 'MemAvailable: 78460408 kB' 'Buffers: 9664 kB' 'Cached: 12374708 kB' 'SwapCached: 0 kB' 'Active: 9400416 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677508 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 571228 kB' 'Mapped: 197588 kB' 'Shmem: 8109076 kB' 'KReclaimable: 203020 kB' 'Slab: 580572 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377552 kB' 'KernelStack: 16176 kB' 'PageTables: 9004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10015320 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210612 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:16.222 11:39:06 -- setup/common.sh@32 -- # [xtrace collapsed: the @31/@32 skip loop tests every key from MemTotal through HugePages_Rsvd against HugePages_Surp; none match]
00:03:16.223 11:39:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.223 11:39:06 -- setup/common.sh@33 -- # echo 0
00:03:16.223 11:39:06 -- setup/common.sh@33 -- # return 0
00:03:16.223 11:39:06 -- setup/hugepages.sh@99 -- # surp=0
00:03:16.223 11:39:06 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:16.223 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:16.223 11:39:06 -- setup/common.sh@18 -- # local node=
00:03:16.223 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.223 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.223 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.223 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.223 11:39:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.223 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.223 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.223 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74736320 kB' 'MemAvailable: 78459504 kB' 'Buffers: 9664 kB' 'Cached: 12374720 kB' 'SwapCached: 0 kB' 'Active: 9401572 kB' 'Inactive: 3552388 kB' 'Active(anon): 8678664 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573256 kB' 'Mapped: 197624 kB' 'Shmem: 8109088 kB' 'KReclaimable: 203020 kB' 'Slab: 580536 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377516 kB' 'KernelStack: 16064 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10018316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210596 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
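The snapshots are internally consistent: HugePages_Total (1024 pages) times Hugepagesize (2048 kB) equals the reported Hugetlb of 2097152 kB, i.e. the 2 GiB pool that the two 512-page node reservations add up to. A one-line cross-check over the field values above (standard /proc/meminfo keys; the variable names are just for illustration):

    # Hugetlb should equal HugePages_Total * Hugepagesize.
    total=1024          # HugePages_Total (pages)
    pagesize_kb=2048    # Hugepagesize (kB)
    hugetlb_kb=2097152  # Hugetlb (kB), as reported in the snapshots
    (( total * pagesize_kb == hugetlb_kb )) && echo consistent   # 1024*2048 = 2097152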
00:03:16.223 11:39:06 -- setup/common.sh@32 -- # [xtrace collapsed: the @31/@32 skip loop tests every key from MemTotal through HugePages_Free against HugePages_Rsvd; none match]
00:03:16.487 11:39:06 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:16.487 11:39:06 -- setup/common.sh@33 -- # echo 0
00:03:16.487 11:39:06 -- setup/common.sh@33 -- # return 0
00:03:16.487 11:39:06 -- setup/hugepages.sh@100 -- # resv=0
00:03:16.487 11:39:06 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:16.487 nr_hugepages=1024
00:03:16.487 11:39:06 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:16.487 resv_hugepages=0
00:03:16.487 11:39:06 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:16.487 surplus_hugepages=0
00:03:16.487 11:39:06 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:16.487 anon_hugepages=0
00:03:16.487 11:39:06 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:16.487 11:39:06 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
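Those two arithmetic guards are the core of verify_nr_hugepages: the configured total (NRHUGE=512 on each of two nodes, hence 1024) must equal HugePages_Total plus the surplus and reserved counts, and with no surplus or reserved pages it must equal HugePages_Total exactly. A hedged sketch of that accounting, reusing the illustrative get_meminfo from the earlier sketch (not the SPDK function itself):

    # Assumes the get_meminfo sketch shown earlier is defined in the same shell.
    expected=1024                          # 512 pages on each of 2 nodes
    nr=$(get_meminfo HugePages_Total)      # 1024 in the snapshots above
    surp=$(get_meminfo HugePages_Surp)     # 0
    resv=$(get_meminfo HugePages_Rsvd)     # 0
    (( expected == nr + surp + resv )) || echo "surplus/reserved pages skew the pool" >&2
    (( expected == nr )) || echo "allocated count differs from requested" >&2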
00:03:16.487 11:39:06 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:16.487 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:16.487 11:39:06 -- setup/common.sh@18 -- # local node=
00:03:16.487 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.487 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.487 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.487 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.487 11:39:06 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.487 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.487 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.488 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74735972 kB' 'MemAvailable: 78459156 kB' 'Buffers: 9664 kB' 'Cached: 12374732 kB' 'SwapCached: 0 kB' 'Active: 9403992 kB' 'Inactive: 3552388 kB' 'Active(anon): 8681084 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575192 kB' 'Mapped: 197964 kB' 'Shmem: 8109100 kB' 'KReclaimable: 203020 kB' 'Slab: 580536 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377516 kB' 'KernelStack: 16096 kB' 'PageTables: 8744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10020212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210568 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:16.488 11:39:06 -- setup/common.sh@32 -- # [xtrace collapsed: the @31/@32 skip loop tests keys from MemTotal through ShmemPmdMapped against HugePages_Total before the transcript cuts off mid-iteration]
00:03:16.489 11:39:06 -- setup/common.sh@31 --
# IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.489 11:39:06 -- setup/common.sh@33 -- # echo 1024 00:03:16.489 11:39:06 -- setup/common.sh@33 -- # return 0 00:03:16.489 11:39:06 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:16.489 11:39:06 -- setup/hugepages.sh@112 -- # get_nodes 00:03:16.489 11:39:06 -- setup/hugepages.sh@27 -- # local node 00:03:16.489 11:39:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.489 11:39:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:16.489 11:39:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.489 11:39:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:16.489 11:39:06 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:16.489 11:39:06 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:16.489 11:39:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:16.489 11:39:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:16.489 11:39:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:16.489 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:16.489 11:39:06 -- setup/common.sh@18 -- # local node=0 00:03:16.489 11:39:06 -- setup/common.sh@19 -- # local var val 00:03:16.489 11:39:06 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.489 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.489 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:16.489 11:39:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:16.489 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.489 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.489 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 
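The trace above is get_meminfo from test/setup/common.sh: it snapshots the relevant meminfo file into an array, then scans it line by line until the requested key matches and echoes the value. A condensed, self-contained re-creation of that traced logic (a sketch, not the exact SPDK helper):

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) pattern below

    # get_meminfo KEY [NODE] - print KEY's value from /proc/meminfo,
    # or from the per-node meminfo file when NODE is given.
    get_meminfo() {
        local get=$1 node=${2:-} var val _ mem_f mem
        mem_f=/proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node lines carry a "Node N " prefix; strip it so both
        # sources parse identically.
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo HugePages_Total    # prints 1024 on the node traced above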
00:03:16.489 11:39:06 -- setup/hugepages.sh@112 -- # get_nodes
00:03:16.489 11:39:06 -- setup/hugepages.sh@27 -- # local node
00:03:16.489 11:39:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:16.489 11:39:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:16.489 11:39:06 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:16.489 11:39:06 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:16.489 11:39:06 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:16.489 11:39:06 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:16.489 11:39:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:16.489 11:39:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:16.489 11:39:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:16.489 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.489 11:39:06 -- setup/common.sh@18 -- # local node=0
00:03:16.489 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.489 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.489 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.489 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:16.489 11:39:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:16.489 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.489 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.489 11:39:06 -- setup/common.sh@31 -- # IFS=': '
00:03:16.489 11:39:06 -- setup/common.sh@31 -- # read -r var val _
00:03:16.489 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 41886816 kB' 'MemUsed: 6230148 kB' 'SwapCached: 0 kB' 'Active: 3189024 kB' 'Inactive: 140552 kB' 'Active(anon): 2929504 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3091872 kB' 'Mapped: 131540 kB' 'AnonPages: 240940 kB' 'Shmem: 2691800 kB' 'KernelStack: 8152 kB' 'PageTables: 4624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 351632 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 238016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:16.489 11:39:06 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.489 11:39:06 -- setup/common.sh@32 -- # continue
... (the @31/@32 trace repeats for each remaining field of node0's meminfo) ...
00:03:16.490 11:39:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.490 11:39:06 -- setup/common.sh@33 -- # echo 0
00:03:16.490 11:39:06 -- setup/common.sh@33 -- # return 0
00:03:16.490 11:39:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
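Node-scoped reads work because each NUMA node exports its own meminfo under sysfs, with every line prefixed by "Node N ", which is exactly what the @29 strip handles. The per-node hugepage counters can also be read directly, with no parsing at all (paths assume the standard sysfs layout):

    # Dedicated per-node counter for 2 MiB hugepages:
    cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    # The meminfo flavor that the trace parses; note the "Node 0 " prefix:
    grep HugePages_ /sys/devices/system/node/node0/meminfo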
00:03:16.490 11:39:06 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:16.490 11:39:06 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:16.490 11:39:06 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:16.490 11:39:06 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.490 11:39:06 -- setup/common.sh@18 -- # local node=1
00:03:16.490 11:39:06 -- setup/common.sh@19 -- # local var val
00:03:16.490 11:39:06 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.490 11:39:06 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.490 11:39:06 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:16.490 11:39:06 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:16.490 11:39:06 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.490 11:39:06 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.490 11:39:06 -- setup/common.sh@31 -- # IFS=': '
00:03:16.490 11:39:06 -- setup/common.sh@31 -- # read -r var val _
00:03:16.490 11:39:06 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176572 kB' 'MemFree: 32850300 kB' 'MemUsed: 11326272 kB' 'SwapCached: 0 kB' 'Active: 6210532 kB' 'Inactive: 3411836 kB' 'Active(anon): 5747144 kB' 'Inactive(anon): 0 kB' 'Active(file): 463388 kB' 'Inactive(file): 3411836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9292528 kB' 'Mapped: 65920 kB' 'AnonPages: 329968 kB' 'Shmem: 5417304 kB' 'KernelStack: 7960 kB' 'PageTables: 4176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89404 kB' 'Slab: 228892 kB' 'SReclaimable: 89404 kB' 'SUnreclaim: 139488 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:16.490 11:39:06 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.490 11:39:06 -- setup/common.sh@32 -- # continue
... (the @31/@32 trace repeats for each remaining field of node1's meminfo) ...
00:03:16.491 11:39:06 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.491 11:39:06 -- setup/common.sh@33 -- # echo 0
00:03:16.491 11:39:06 -- setup/common.sh@33 -- # return 0
00:03:16.491 11:39:06 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:16.491 11:39:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.491 11:39:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.491 11:39:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.491 11:39:06 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:16.491 node0=512 expecting 512
00:03:16.491 11:39:06 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.491 11:39:06 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.491 11:39:06 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.491 11:39:06 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:16.491 node1=512 expecting 512
00:03:16.491 11:39:06 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:16.491
00:03:16.491 real    0m5.400s
00:03:16.491 user    0m1.819s
00:03:16.491 sys     0m3.571s
00:03:16.491 11:39:06 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:16.491 11:39:06 -- common/autotest_common.sh@10 -- # set +x
00:03:16.491 ************************************
00:03:16.491 END TEST per_node_1G_alloc
00:03:16.491 ************************************
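per_node_1G_alloc passes because the bookkeeping lines up: the global pool of 1024 hugepages is split evenly, surplus and reserved pages are folded into each node's expected count, and both nodes report 512. The arithmetic can be replayed in a few lines of shell (values taken from the trace above; variable names are illustrative, not the script's):

    # Replay of the verification math from the trace above.
    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv )) && echo "global pool consistent"
    nodes_test=(512 512)                    # expected even split over 2 nodes
    for node in 0 1; do
        (( nodes_test[node] += resv + 0 ))  # per-node HugePages_Surp was 0
        echo "node$node=${nodes_test[node]} expecting 512"
    done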
00:03:16.491 11:39:06 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:16.491 11:39:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:16.491 11:39:06 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:16.491 11:39:06 -- common/autotest_common.sh@10 -- # set +x
00:03:16.751 ************************************
00:03:16.751 START TEST even_2G_alloc
00:03:16.751 ************************************
00:03:16.751 11:39:07 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:16.751 11:39:07 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:16.751 11:39:07 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:16.751 11:39:07 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:16.751 11:39:07 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:16.751 11:39:07 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:16.751 11:39:07 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:16.751 11:39:07 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:16.751 11:39:07 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:16.751 11:39:07 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:16.751 11:39:07 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:16.751 11:39:07 -- setup/hugepages.sh@83 -- # : 512
00:03:16.751 11:39:07 -- setup/hugepages.sh@84 -- # : 1
00:03:16.751 11:39:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.751 11:39:07 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:16.752 11:39:07 -- setup/hugepages.sh@83 -- # : 0
00:03:16.752 11:39:07 -- setup/hugepages.sh@84 -- # : 0
00:03:16.752 11:39:07 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.752 11:39:07 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:16.752 11:39:07 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:16.752 11:39:07 -- setup/hugepages.sh@153 -- # setup output
00:03:16.752 11:39:07 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:16.752 11:39:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:20.084 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:20.084 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:20.084 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
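The even split is driven by the two environment knobs the trace sets at @153 before invoking scripts/setup.sh: NRHUGE fixes the total hugepage count and HUGE_EVEN_ALLOC asks for it to be spread across NUMA nodes. Outside this CI job, roughly the same setup can be reproduced like this (assuming an SPDK checkout in ./spdk and root privileges):

    # Reserve 1024 x 2 MiB hugepages, split evenly across NUMA nodes,
    # and (re)bind supported devices to vfio-pci:
    sudo NRHUGE=1024 HUGE_EVEN_ALLOC=yes ./spdk/scripts/setup.sh
    # Inspect the result the same way verify_nr_hugepages does:
    grep HugePages_Total /sys/devices/system/node/node*/meminfo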
00:03:21.993 11:39:12 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:21.993 11:39:12 -- setup/hugepages.sh@89 -- # local node
00:03:21.993 11:39:12 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:21.993 11:39:12 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:21.993 11:39:12 -- setup/hugepages.sh@92 -- # local surp
00:03:21.993 11:39:12 -- setup/hugepages.sh@93 -- # local resv
00:03:21.993 11:39:12 -- setup/hugepages.sh@94 -- # local anon
00:03:21.993 11:39:12 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:21.993 11:39:12 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:21.993 11:39:12 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:21.993 11:39:12 -- setup/common.sh@18 -- # local node=
00:03:21.993 11:39:12 -- setup/common.sh@19 -- # local var val
00:03:21.993 11:39:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.993 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.993 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.993 11:39:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.993 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.993 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.993 11:39:12 -- setup/common.sh@31 -- # IFS=': '
00:03:21.993 11:39:12 -- setup/common.sh@31 -- # read -r var val _
00:03:21.993 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74773524 kB' 'MemAvailable: 78496708 kB' 'Buffers: 9664 kB' 'Cached: 12374868 kB' 'SwapCached: 0 kB' 'Active: 9397184 kB' 'Inactive: 3552388 kB' 'Active(anon): 8674276 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 567932 kB' 'Mapped: 197360 kB' 'Shmem: 8109236 kB' 'KReclaimable: 203020 kB' 'Slab: 580384 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377364 kB' 'KernelStack: 16048 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10011536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210628 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:21.993 11:39:12 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:21.993 11:39:12 -- setup/common.sh@32 -- # continue
... (the @31/@32 trace repeats for each meminfo field until AnonHugePages matches) ...
00:03:21.994 11:39:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:21.994 11:39:12 -- setup/common.sh@33 -- # echo 0
00:03:21.994 11:39:12 -- setup/common.sh@33 -- # return 0
00:03:21.994 11:39:12 -- setup/hugepages.sh@97 -- # anon=0
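Before sampling AnonHugePages, verify_nr_hugepages checks at @96 that transparent hugepages are not globally disabled; the bracketed entry in the sysfs control file marks the active mode (madvise on this machine). The same state can be inspected by hand (standard sysfs path; values will differ per host):

    cat /sys/kernel/mm/transparent_hugepage/enabled
    # -> always [madvise] never    (brackets mark the active THP mode)
    grep AnonHugePages /proc/meminfo
    # -> AnonHugePages: 0 kB on this box, hence anon=0 in the trace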
00:03:21.994 11:39:12 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:21.994 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:21.994 11:39:12 -- setup/common.sh@18 -- # local node=
00:03:21.994 11:39:12 -- setup/common.sh@19 -- # local var val
00:03:21.994 11:39:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.994 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.994 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.994 11:39:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.994 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.994 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.994 11:39:12 -- setup/common.sh@31 -- # IFS=': '
00:03:21.994 11:39:12 -- setup/common.sh@31 -- # read -r var val _
00:03:21.994 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74774056 kB' 'MemAvailable: 78497240 kB' 'Buffers: 9664 kB' 'Cached: 12374872 kB' 'SwapCached: 0 kB' 'Active: 9396848 kB' 'Inactive: 3552388 kB' 'Active(anon): 8673940 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 568096 kB' 'Mapped: 197204 kB' 'Shmem: 8109240 kB' 'KReclaimable: 203020 kB' 'Slab: 580356 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377336 kB' 'KernelStack: 16032 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10011548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210612 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:21.994 11:39:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:21.994 11:39:12 -- setup/common.sh@32 -- # continue
... (the @31/@32 trace repeats field by field; the scan is still in progress as the log continues) ...
00:03:21.995 11:39:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:21.995 11:39:12 -- setup/common.sh@32 -- # continue
00:03:21.995 11:39:12 --
setup/common.sh@31 -- # IFS=': ' 00:03:21.995 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.995 11:39:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.995 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.995 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.995 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # continue 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:21.996 11:39:12 -- setup/common.sh@33 -- # echo 0 00:03:21.996 11:39:12 -- setup/common.sh@33 -- # return 0 00:03:21.996 11:39:12 -- setup/hugepages.sh@99 -- # surp=0 00:03:21.996 11:39:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:21.996 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:21.996 11:39:12 -- setup/common.sh@18 -- # local node= 00:03:21.996 11:39:12 -- setup/common.sh@19 -- # local var val 00:03:21.996 11:39:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:21.996 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:21.996 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:21.996 11:39:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:21.996 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:21.996 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:21.996 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
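[editor's note: the pattern the xtrace above keeps repeating reduces to a few lines of shell. A minimal sketch of the behavior visible in the trace, assumed and simplified, not the verbatim setup/common.sh source:]

get_meminfo() {
    local get=$1 var val _
    # One trace record per key: non-matching keys hit 'continue';
    # the requested key's value is echoed and the function returns.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done </proc/meminfo
    return 1
}

surp=$(get_meminfo HugePages_Surp)   # 0 on this box, matching the trace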
00:03:21.996 11:39:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:21.996 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:21.996 11:39:12 -- setup/common.sh@18 -- # local node=
00:03:21.996 11:39:12 -- setup/common.sh@19 -- # local var val
00:03:21.996 11:39:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:21.996 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:21.996 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:21.996 11:39:12 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:21.996 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:21.996 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:21.996 11:39:12 -- setup/common.sh@31 -- # IFS=': '
00:03:21.996 11:39:12 -- setup/common.sh@31 -- # read -r var val _
00:03:21.996 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74774056 kB' 'MemAvailable: 78497240 kB' 'Buffers: 9664 kB' 'Cached: 12374884 kB' 'SwapCached: 0 kB' 'Active: 9396928 kB' 'Inactive: 3552388 kB' 'Active(anon): 8674020 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 568092 kB' 'Mapped: 197204 kB' 'Shmem: 8109252 kB' 'KReclaimable: 203020 kB' 'Slab: 580356 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377336 kB' 'KernelStack: 16032 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10011564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210612 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
[xtrace elided: the same per-key scan, this time matching against HugePages_Rsvd]
00:03:21.997 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:21.997 11:39:12 -- setup/common.sh@33 -- # echo 0
00:03:21.997 11:39:12 -- setup/common.sh@33 -- # return 0
00:03:21.997 11:39:12 -- setup/hugepages.sh@100 -- # resv=0
00:03:21.997 11:39:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:21.997 11:39:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:21.997 11:39:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:21.997 11:39:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:21.997 11:39:12 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:21.997 11:39:12 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
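[editor's note: the four echoes and the two arithmetic guards above amount to a hugepage accounting check: the requested page count must be fully visible in the kernel counters. A sketch of the same check; req and the error messages are illustrative, not from the log:]

req=1024
nr=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
# Mirrors hugepages.sh@107 and @109: the counters must reconcile, and
# the full request must actually have been allocated.
(( req == nr + surp + resv )) || echo "hugepage counters do not reconcile" >&2
(( req == nr )) || echo "allocated $nr of $req hugepages" >&2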
00:03:21.997 11:39:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:21.997 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Total
[xtrace elided: same local/mem_f/mapfile setup as above with node unset, a near-identical full /proc/meminfo printf (HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0), and the per-key scan against HugePages_Total]
00:03:22.260 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:22.260 11:39:12 -- setup/common.sh@33 -- # echo 1024
00:03:22.260 11:39:12 -- setup/common.sh@33 -- # return 0
00:03:22.260 11:39:12 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:22.260 11:39:12 -- setup/hugepages.sh@112 -- # get_nodes
00:03:22.260 11:39:12 -- setup/hugepages.sh@27 -- # local node
00:03:22.260 11:39:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:22.260 11:39:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:22.260 11:39:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:22.260 11:39:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:22.260 11:39:12 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:22.260 11:39:12 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:22.260 11:39:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:22.260 11:39:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:22.260 11:39:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:22.260 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:22.260 11:39:12 -- setup/common.sh@18 -- # local node=0
00:03:22.260 11:39:12 -- setup/common.sh@19 -- # local var val
00:03:22.260 11:39:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.260 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.260 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:22.260 11:39:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:22.260 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.260 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.260 11:39:12 -- setup/common.sh@31 -- # IFS=': '
00:03:22.260 11:39:12 -- setup/common.sh@31 -- # read -r var val _
00:03:22.260 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 41922992 kB' 'MemUsed: 6193972 kB' 'SwapCached: 0 kB' 'Active: 3186000 kB' 'Inactive: 140552 kB' 'Active(anon): 2926480 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3091988 kB' 'Mapped: 131352 kB' 'AnonPages: 237704 kB' 'Shmem: 2691916 kB' 'KernelStack: 8056 kB' 'PageTables: 4260 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 350952 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 237336 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
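[editor's note: the pass above switched mem_f to /sys/devices/system/node/node0/meminfo. Unlike /proc/meminfo, each line there carries a 'Node N ' prefix, which the trace shows being stripped with an extglob substitution (mem=("${mem[@]#Node +([0-9]) }")) before the same key scan runs. A sketch of that per-node pass, with illustrative variable names:]

shopt -s extglob   # required for the +([0-9]) pattern seen in the trace
node=0
mapfile -t mem <"/sys/devices/system/node/node$node/meminfo"
mem=("${mem[@]#Node +([0-9]) }")   # 'Node 0 MemTotal: ...' -> 'MemTotal: ...'
for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<<"$line"
    [[ $var == HugePages_Surp ]] && { echo "node$node HugePages_Surp: $val"; break; }
done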
[xtrace elided: per-key scan of the node0 meminfo snapshot against HugePages_Surp]
00:03:22.261 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:22.261 11:39:12 -- setup/common.sh@33 -- # echo 0
00:03:22.261 11:39:12 -- setup/common.sh@33 -- # return 0
00:03:22.261 11:39:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:22.261 11:39:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:22.261 11:39:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:22.261 11:39:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:22.261 11:39:12 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:22.261 11:39:12 -- setup/common.sh@18 -- # local node=1
00:03:22.261 11:39:12 -- setup/common.sh@19 -- # local var val
00:03:22.261 11:39:12 -- setup/common.sh@20 -- # local mem_f mem
00:03:22.261 11:39:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:22.261 11:39:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:22.261 11:39:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:22.261 11:39:12 -- setup/common.sh@28 -- # mapfile -t mem
00:03:22.261 11:39:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:22.261 11:39:12 -- setup/common.sh@31 -- # IFS=': '
00:03:22.261 11:39:12 -- setup/common.sh@31 -- # read -r var val _
00:03:22.261 11:39:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176572 kB' 'MemFree: 32851140 kB' 'MemUsed: 11325432 kB' 'SwapCached: 0 kB' 'Active: 6210664 kB' 'Inactive: 3411836 kB' 'Active(anon): 5747276 kB' 'Inactive(anon): 0 kB' 'Active(file): 463388 kB' 'Inactive(file): 3411836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9292596 kB' 'Mapped: 65852 kB' 'AnonPages: 330060 kB' 'Shmem: 5417372 kB' 'KernelStack: 7976 kB' 'PageTables: 4272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89404 kB' 'Slab: 229404 kB' 'SReclaimable: 89404 kB' 'SUnreclaim: 140000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: per-key scan of the node1 meminfo snapshot against HugePages_Surp, up to the HugePages_Total key; the trace continues below]
00:03:22.262 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.262 11:39:12 --
setup/common.sh@32 -- # continue 00:03:22.262 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.262 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.262 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.262 11:39:12 -- setup/common.sh@32 -- # continue 00:03:22.262 11:39:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:22.262 11:39:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:22.262 11:39:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:22.262 11:39:12 -- setup/common.sh@33 -- # echo 0 00:03:22.262 11:39:12 -- setup/common.sh@33 -- # return 0 00:03:22.262 11:39:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:22.262 11:39:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:22.262 11:39:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:22.262 11:39:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:22.262 11:39:12 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:22.262 node0=512 expecting 512 00:03:22.262 11:39:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:22.262 11:39:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:22.262 11:39:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:22.262 11:39:12 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:22.262 node1=512 expecting 512 00:03:22.262 11:39:12 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:22.262 00:03:22.262 real 0m5.561s 00:03:22.262 user 0m1.882s 00:03:22.262 sys 0m3.680s 00:03:22.262 11:39:12 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:22.262 11:39:12 -- common/autotest_common.sh@10 -- # set +x 00:03:22.262 ************************************ 00:03:22.262 END TEST even_2G_alloc 00:03:22.262 ************************************ 00:03:22.262 11:39:12 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:22.262 11:39:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:22.262 11:39:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:22.263 11:39:12 -- common/autotest_common.sh@10 -- # set +x 00:03:22.523 ************************************ 00:03:22.523 START TEST odd_alloc 00:03:22.523 ************************************ 00:03:22.523 11:39:12 -- common/autotest_common.sh@1111 -- # odd_alloc 00:03:22.523 11:39:12 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:22.523 11:39:12 -- setup/hugepages.sh@49 -- # local size=2098176 00:03:22.523 11:39:12 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:22.523 11:39:12 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:22.523 11:39:12 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:22.523 11:39:12 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:22.523 11:39:12 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:22.523 11:39:12 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:22.523 11:39:12 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:22.523 11:39:12 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:22.523 11:39:12 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:22.523 11:39:12 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:22.523 11:39:12 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:22.523 11:39:12 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:22.523 11:39:12 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:22.523 11:39:12 -- setup/hugepages.sh@82 -- # 
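The compare-and-continue wall condensed above is setup/common.sh's get_meminfo scanning every meminfo field until the requested one matches. A minimal, hedged reconstruction follows; the function and variable names (get_meminfo, mem_f, get) are taken from the xtrace, but the body is an approximation rather than the exact SPDK source:

  #!/usr/bin/env bash
  # sketch of the get_meminfo helper traced above: read the
  # (optionally per-node) meminfo file and print one field's value
  shopt -s extglob
  get_meminfo() {
      local get=$1 node=${2:-} var val _
      local mem_f=/proc/meminfo
      # a node argument switches to the per-node sysfs copy
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      local line
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }") # per-node lines carry a "Node N" prefix
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue # the repeating compare/continue pair in the log
          echo "${val:-0}"
          return 0
      done
  }
  get_meminfo HugePages_Surp 0 # prints 0 against a node0 dump like the one above

Each mismatching field produces one compare plus one continue in the xtrace, which is why a single lookup expands to dozens of near-identical log lines.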
00:03:22.262 11:39:12 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:22.262 11:39:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:22.262 11:39:12 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:22.262 11:39:12 -- common/autotest_common.sh@10 -- # set +x
00:03:22.523 ************************************
00:03:22.523 START TEST odd_alloc
00:03:22.523 ************************************
00:03:22.523 11:39:12 -- common/autotest_common.sh@1111 -- # odd_alloc
00:03:22.523 11:39:12 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:22.523 11:39:12 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:22.523 11:39:12 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:22.523 11:39:12 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:22.523 11:39:12 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:22.523 11:39:12 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:22.523 11:39:12 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:22.523 11:39:12 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:22.523 11:39:12 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:22.523 11:39:12 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:22.523 11:39:12 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:22.523 11:39:12 -- setup/hugepages.sh@83 -- # : 513
00:03:22.523 11:39:12 -- setup/hugepages.sh@84 -- # : 1
00:03:22.523 11:39:12 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:22.523 11:39:12 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:22.523 11:39:12 -- setup/hugepages.sh@83 -- # : 0
00:03:22.523 11:39:12 -- setup/hugepages.sh@84 -- # : 0
00:03:22.523 11:39:12 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
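The @81-@84 lines above are the per-node distribution for the odd allocation. A hedged reconstruction of that loop (the ": 513" / ": 1" lines are the arguments the two ": $(( ... ))" no-ops received in the trace), showing why 1025 pages split as node0=513, node1=512:

  # highest node first: each node gets floor(remaining / nodes_left),
  # so the odd remainder accumulates onto node0
  _nr_hugepages=1025
  _no_nodes=2
  declare -a nodes_test
  while (( _no_nodes > 0 )); do
      nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
      : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] )) # traces as ": 513", then ": 0"
      : $(( --_no_nodes ))                                # traces as ": 1",   then ": 0"
  done
  echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"    # node0=513 node1=512

The first pass assigns 1025/2 = 512 to node1, leaving 513 for node0 on the second pass, which is exactly the nodes_test[...]=512 then nodes_test[...]=513 sequence in the trace.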
00:03:22.523 11:39:12 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:22.523 11:39:12 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:22.523 11:39:12 -- setup/hugepages.sh@160 -- # setup output
00:03:22.523 11:39:12 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:22.523 11:39:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:25.922 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:25.922 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:25.922 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:25.923 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:27.834 11:39:17 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:27.834 11:39:17 -- setup/hugepages.sh@89 -- # local node
00:03:27.834 11:39:17 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:27.834 11:39:17 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:27.834 11:39:17 -- setup/hugepages.sh@92 -- # local surp
00:03:27.834 11:39:17 -- setup/hugepages.sh@93 -- # local resv
00:03:27.834 11:39:17 -- setup/hugepages.sh@94 -- # local anon
00:03:27.834 11:39:17 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:27.834 11:39:17 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:27.834 11:39:17 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:27.834 11:39:17 -- setup/common.sh@18 -- # local node=
00:03:27.834 11:39:17 -- setup/common.sh@19 -- # local var val
00:03:27.834 11:39:17 -- setup/common.sh@20 -- # local mem_f mem
00:03:27.834 11:39:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:27.834 11:39:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:27.834 11:39:17 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:27.834 11:39:17 -- setup/common.sh@28 -- # mapfile -t mem
00:03:27.834 11:39:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:27.834 11:39:17 -- setup/common.sh@31 -- # IFS=': '
00:03:27.834 11:39:17 -- setup/common.sh@31 -- # read -r var val _
00:03:27.834 11:39:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74807972 kB' 'MemAvailable: 78531156 kB' 'Buffers: 9664 kB' 'Cached: 12375024 kB' 'SwapCached: 0 kB' 'Active: 9399044 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676136 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569628 kB' 'Mapped: 197476 kB' 'Shmem: 8109392 kB' 'KReclaimable: 203020 kB' 'Slab: 579860 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 376840 kB' 'KernelStack: 16048 kB' 'PageTables: 8620 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 10014324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210644 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:27.834 11:39:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:27.834 11:39:17 -- setup/common.sh@32 -- # continue
[... the identical compare / continue xtrace repeats for every other field of the dump above until AnonHugePages matches ...]
00:03:27.835 11:39:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:27.835 11:39:17 -- setup/common.sh@33 -- # echo 0
00:03:27.835 11:39:17 -- setup/common.sh@33 -- # return 0
00:03:27.835 11:39:17 -- setup/hugepages.sh@97 -- # anon=0
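anon comes out as 0 here because the dump's AnonHugePages field is 0 kB; the sampling itself sits behind the @96 transparent-hugepage check. A hedged sketch of that step, reusing the get_meminfo sketch from earlier (the sysfs path is the standard THP control file; the exact guard in hugepages.sh may differ):

  # count anonymous THP only while THP is not pinned to [never];
  # the traced mode is "always [madvise] never", so the sample runs
  thp_enabled=$(< /sys/kernel/mm/transparent_hugepage/enabled)
  anon=0
  if [[ $thp_enabled != *"[never]"* ]]; then
      anon=$(get_meminfo AnonHugePages) # 0 kB in the dump above -> anon=0
  fi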
00:03:27.835 11:39:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:27.835 11:39:17 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:27.835 11:39:17 -- setup/common.sh@18 -- # local node=
00:03:27.835 11:39:17 -- setup/common.sh@19 -- # local var val
00:03:27.835 11:39:17 -- setup/common.sh@20 -- # local mem_f mem
00:03:27.835 11:39:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:27.835 11:39:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:27.835 11:39:17 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:27.835 11:39:17 -- setup/common.sh@28 -- # mapfile -t mem
00:03:27.835 11:39:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:27.836 11:39:17 -- setup/common.sh@31 -- # IFS=': '
00:03:27.836 11:39:17 -- setup/common.sh@31 -- # read -r var val _
00:03:27.836 11:39:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74814576 kB' 'MemAvailable: 78537760 kB' 'Buffers: 9664 kB' 'Cached: 12375036 kB' 'SwapCached: 0 kB' 'Active: 9399664 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676756 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570288 kB' 'Mapped: 197480 kB' 'Shmem: 8109404 kB' 'KReclaimable: 203020 kB' 'Slab: 579836 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 376816 kB' 'KernelStack: 16240 kB' 'PageTables: 8932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 10012940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210692 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:27.836 11:39:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:27.836 11:39:17 -- setup/common.sh@32 -- # continue
[... the identical compare / continue xtrace repeats for every other field of the dump above until HugePages_Surp matches ...]
00:03:27.837 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:27.837 11:39:18 -- setup/common.sh@33 -- # echo 0
00:03:27.837 11:39:18 -- setup/common.sh@33 -- # return 0
00:03:27.837 11:39:18 -- setup/hugepages.sh@99 -- # surp=0
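With anon and surp collected (both 0) and resv scanned next, the three counters feed the consistency check that appears further down in the trace as (( 1025 == nr_hugepages + surp + resv )). A hedged sketch of that accounting, again reusing the get_meminfo sketch from earlier (variable names follow the trace):

  # the kernel must report exactly the requested total: any surplus
  # or reserved hugepages would surface as a mismatch here
  nr_hugepages=1025
  surp=$(get_meminfo HugePages_Surp)   # 0 in the dump above
  resv=$(get_meminfo HugePages_Rsvd)   # 0, scanned in the next block
  total=$(get_meminfo HugePages_Total) # 1025 per the dumps
  (( total == nr_hugepages + surp + resv )) && echo 'hugepage accounting consistent'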
00:03:27.837 11:39:18 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:27.837 11:39:18 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:27.837 11:39:18 -- setup/common.sh@18 -- # local node=
00:03:27.837 11:39:18 -- setup/common.sh@19 -- # local var val
00:03:27.837 11:39:18 -- setup/common.sh@20 -- # local mem_f mem
00:03:27.837 11:39:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:27.837 11:39:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:27.837 11:39:18 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:27.837 11:39:18 -- setup/common.sh@28 -- # mapfile -t mem
00:03:27.837 11:39:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:27.837 11:39:18 -- setup/common.sh@31 -- # IFS=': '
00:03:27.837 11:39:18 -- setup/common.sh@31 -- # read -r var val _
00:03:27.837 11:39:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74838212 kB' 'MemAvailable: 78561396 kB' 'Buffers: 9664 kB' 'Cached: 12375040 kB' 'SwapCached: 0 kB' 'Active: 9398076 kB' 'Inactive: 3552388 kB' 'Active(anon): 8675168 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569016 kB' 'Mapped: 197320 kB' 'Shmem: 8109408 kB' 'KReclaimable: 203020 kB' 'Slab: 579812 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 376792 kB' 'KernelStack: 16144 kB' 'PageTables: 8944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 10014352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:27.837 11:39:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:27.837 11:39:18 -- setup/common.sh@32 -- # continue
[... the identical compare / continue xtrace repeats for every other field of the dump above until HugePages_Rsvd matches ...]
00:03:27.838 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:27.838 11:39:18 -- setup/common.sh@33 -- # echo 0
00:03:27.838 11:39:18 -- setup/common.sh@33 -- # return 0
00:03:27.838 11:39:18 -- setup/hugepages.sh@100 -- # resv=0
00:03:27.838 11:39:18 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:27.838 nr_hugepages=1025
00:03:27.838 11:39:18 -- setup/hugepages.sh@103 -- #
echo resv_hugepages=0 00:03:27.838 resv_hugepages=0 00:03:27.838 11:39:18 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:27.838 surplus_hugepages=0 00:03:27.838 11:39:18 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:27.838 anon_hugepages=0 00:03:27.838 11:39:18 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:27.838 11:39:18 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:27.838 11:39:18 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:27.838 11:39:18 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:27.838 11:39:18 -- setup/common.sh@18 -- # local node= 00:03:27.838 11:39:18 -- setup/common.sh@19 -- # local var val 00:03:27.838 11:39:18 -- setup/common.sh@20 -- # local mem_f mem 00:03:27.838 11:39:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:27.838 11:39:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:27.838 11:39:18 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:27.838 11:39:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:27.838 11:39:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:27.838 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.838 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74838540 kB' 'MemAvailable: 78561724 kB' 'Buffers: 9664 kB' 'Cached: 12375052 kB' 'SwapCached: 0 kB' 'Active: 9398364 kB' 'Inactive: 3552388 kB' 'Active(anon): 8675456 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569256 kB' 'Mapped: 197320 kB' 'Shmem: 8109420 kB' 'KReclaimable: 203020 kB' 'Slab: 579684 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 376664 kB' 'KernelStack: 16160 kB' 'PageTables: 8780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485772 kB' 'Committed_AS: 10012972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210756 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.839 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.839 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- 
setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:27.840 11:39:18 -- setup/common.sh@33 -- # echo 1025 00:03:27.840 11:39:18 -- setup/common.sh@33 -- # return 0 00:03:27.840 11:39:18 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:27.840 11:39:18 -- setup/hugepages.sh@112 -- # get_nodes 00:03:27.840 11:39:18 -- setup/hugepages.sh@27 -- # local node 00:03:27.840 11:39:18 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:27.840 11:39:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:27.840 11:39:18 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:27.840 11:39:18 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:27.840 11:39:18 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:27.840 11:39:18 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:27.840 11:39:18 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:27.840 11:39:18 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:27.840 11:39:18 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:27.840 11:39:18 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:27.840 11:39:18 -- setup/common.sh@18 -- # local node=0 00:03:27.840 11:39:18 -- setup/common.sh@19 -- # local var val 00:03:27.840 11:39:18 -- setup/common.sh@20 -- # local mem_f mem 00:03:27.840 11:39:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:27.840 
11:39:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:27.840 11:39:18 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:27.840 11:39:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:27.840 11:39:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 41930280 kB' 'MemUsed: 6186684 kB' 'SwapCached: 0 kB' 'Active: 3187328 kB' 'Inactive: 140552 kB' 'Active(anon): 2927808 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3092052 kB' 'Mapped: 131352 kB' 'AnonPages: 238996 kB' 'Shmem: 2691980 kB' 'KernelStack: 8024 kB' 'PageTables: 4328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 350528 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 236912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.840 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.840 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@33 -- # echo 0 00:03:27.841 11:39:18 -- setup/common.sh@33 -- # return 0 00:03:27.841 11:39:18 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:27.841 11:39:18 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:27.841 11:39:18 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:27.841 11:39:18 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:27.841 11:39:18 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:27.841 11:39:18 -- setup/common.sh@18 -- # local node=1 00:03:27.841 11:39:18 -- setup/common.sh@19 -- # local var val 00:03:27.841 11:39:18 -- setup/common.sh@20 -- # local mem_f mem 00:03:27.841 11:39:18 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:27.841 11:39:18 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:27.841 11:39:18 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:27.841 11:39:18 -- setup/common.sh@28 -- # mapfile -t mem 00:03:27.841 11:39:18 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176572 kB' 'MemFree: 32905744 kB' 'MemUsed: 11270828 kB' 'SwapCached: 0 kB' 'Active: 6210548 kB' 'Inactive: 3411836 kB' 'Active(anon): 5747160 kB' 'Inactive(anon): 0 kB' 'Active(file): 463388 kB' 'Inactive(file): 3411836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9292680 kB' 'Mapped: 65968 kB' 'AnonPages: 330256 kB' 'Shmem: 5417456 kB' 'KernelStack: 8104 kB' 'PageTables: 4308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89404 kB' 'Slab: 229156 kB' 'SReclaimable: 89404 kB' 'SUnreclaim: 139752 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 
00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.841 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.841 11:39:18 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # continue 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # IFS=': ' 00:03:27.842 11:39:18 -- setup/common.sh@31 -- # read -r var val _ 00:03:27.842 11:39:18 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:27.842 11:39:18 -- setup/common.sh@33 -- # echo 0 00:03:27.842 11:39:18 -- setup/common.sh@33 -- # return 0 00:03:27.842 11:39:18 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:27.842 11:39:18 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:27.842 11:39:18 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:27.842 11:39:18 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:27.842 11:39:18 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:27.842 node0=512 expecting 513 00:03:27.842 11:39:18 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:27.842 11:39:18 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:27.842 11:39:18 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:27.842 11:39:18 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:27.842 node1=513 expecting 512 00:03:27.842 11:39:18 -- 
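The "[[ 512 513 == \5\1\2\ \5\1\3 ]]" comparison on the next line is what the sorted_t/sorted_s bookkeeping above builds up to: each per-node count is used as a numeric array index, so expanding the indices yields a sorted, deduplicated list, and the spread only has to match as a multiset, not node-for-node (hence "node0=512 expecting 513" still passing). A minimal bash sketch of that trick, with the per-node counts hard-coded for illustration:

    #!/usr/bin/env bash
    # Sketch of the sorted_t/sorted_s bookkeeping traced above; input
    # values are hard-coded here, the harness fills them from the test
    # plan (nodes_test) and from /sys (nodes_sys).
    nodes_test=(512 513)   # pages the test assigned to node0, node1
    nodes_sys=(513 512)    # pages the kernel actually placed per node
    sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        # Using the count as a numeric index means "${!sorted_t[*]}"
        # expands to the counts in ascending order, deduplicated.
        sorted_t[nodes_test[node]]=1
        sorted_s[nodes_sys[node]]=1
    done
    # "512 513" == "512 513": the swap between nodes does not matter.
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo OK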
00:03:27.842 11:39:18 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:27.842
00:03:27.842 real 0m5.302s
00:03:27.842 user 0m1.686s
00:03:27.842 sys 0m3.593s
00:03:27.842 11:39:18 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:27.842 11:39:18 -- common/autotest_common.sh@10 -- # set +x
00:03:27.842 ************************************
00:03:27.842 END TEST odd_alloc
00:03:27.842 ************************************
00:03:27.842 11:39:18 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:27.842 11:39:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:27.842 11:39:18 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:27.842 11:39:18 -- common/autotest_common.sh@10 -- # set +x
00:03:27.842 ************************************
00:03:27.842 START TEST custom_alloc
00:03:27.842 ************************************
00:03:27.842 11:39:18 -- common/autotest_common.sh@1111 -- # custom_alloc
00:03:27.842 11:39:18 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:27.842 11:39:18 -- setup/hugepages.sh@169 -- # local node
00:03:27.842 11:39:18 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:27.842 11:39:18 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:27.842 11:39:18 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:27.842 11:39:18 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:27.842 11:39:18 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:27.842 11:39:18 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:27.842 11:39:18 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:27.842 11:39:18 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:27.842 11:39:18 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:27.842 11:39:18 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:27.842 11:39:18 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:27.842 11:39:18 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:27.842 11:39:18 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:27.843 11:39:18 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:27.843 11:39:18 -- setup/hugepages.sh@83 -- # : 256
00:03:27.843 11:39:18 -- setup/hugepages.sh@84 -- # : 1
00:03:27.843 11:39:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:27.843 11:39:18 -- setup/hugepages.sh@83 -- # : 0
00:03:27.843 11:39:18 -- setup/hugepages.sh@84 -- # : 0
00:03:27.843 11:39:18 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:27.843 11:39:18 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:27.843 11:39:18 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:27.843 11:39:18 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:27.843 11:39:18 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:27.843 11:39:18 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
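The two get_test_nr_hugepages calls just traced (size 1048576 -> nr_hugepages=512, size 2097152 -> nr_hugepages=1024) are plain division by the default huge page size, and the first call's per-node pass then splits the count evenly (256 per node above). A sketch of that arithmetic, assuming sizes in kB and the 2048 kB 'Hugepagesize' reported in the meminfo snapshots; the function names here are illustrative, and the harness's real helper has extra branches for user-supplied node lists, remainders, and the nodes_hp overrides seen below:

    #!/usr/bin/env bash
    # Sketch only: size in kB -> huge page count -> even per-node split.
    default_hugepages=2048   # kB, from 'Hugepagesize: 2048 kB' above

    pages_for() {            # $1 = requested size in kB
        echo $(( $1 / default_hugepages ))
    }

    split_evenly() {         # $1 = total pages, $2 = NUMA node count
        local per_node=$(( $1 / $2 ))
        for (( node = 0; node < $2; node++ )); do
            echo "node$node: $per_node pages"
        done
    }

    split_evenly "$(pages_for 1048576)" 2   # 512 pages -> 256 + 256
    split_evenly "$(pages_for 2097152)" 2   # 1024 pages -> 512 + 512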
user_nodes=() 00:03:27.843 11:39:18 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:27.843 11:39:18 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:27.843 11:39:18 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:27.843 11:39:18 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:27.843 11:39:18 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:27.843 11:39:18 -- setup/hugepages.sh@78 -- # return 0 00:03:27.843 11:39:18 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:27.843 11:39:18 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:27.843 11:39:18 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:27.843 11:39:18 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:27.843 11:39:18 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:27.843 11:39:18 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:27.843 11:39:18 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:27.843 11:39:18 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:27.843 11:39:18 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:27.843 11:39:18 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:27.843 11:39:18 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:27.843 11:39:18 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:27.843 11:39:18 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:27.843 11:39:18 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:27.843 11:39:18 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:27.843 11:39:18 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:27.843 11:39:18 -- setup/hugepages.sh@78 -- # return 0 00:03:27.843 11:39:18 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:27.843 11:39:18 -- setup/hugepages.sh@187 -- # setup output 00:03:27.843 11:39:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:27.843 11:39:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:31.144 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:31.144 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.5 (8086 
2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.144 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:33.056 11:39:23 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:33.056 11:39:23 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:33.056 11:39:23 -- setup/hugepages.sh@89 -- # local node 00:03:33.056 11:39:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:33.056 11:39:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:33.056 11:39:23 -- setup/hugepages.sh@92 -- # local surp 00:03:33.056 11:39:23 -- setup/hugepages.sh@93 -- # local resv 00:03:33.056 11:39:23 -- setup/hugepages.sh@94 -- # local anon 00:03:33.056 11:39:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:33.056 11:39:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:33.056 11:39:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:33.056 11:39:23 -- setup/common.sh@18 -- # local node= 00:03:33.056 11:39:23 -- setup/common.sh@19 -- # local var val 00:03:33.056 11:39:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.056 11:39:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.056 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.056 11:39:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.056 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.057 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.057 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.057 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.057 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 73819864 kB' 'MemAvailable: 77543048 kB' 'Buffers: 9664 kB' 'Cached: 12375180 kB' 'SwapCached: 0 kB' 'Active: 9399360 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676452 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570148 kB' 'Mapped: 197400 kB' 'Shmem: 8109548 kB' 'KReclaimable: 203020 kB' 'Slab: 580164 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377144 kB' 'KernelStack: 16192 kB' 'PageTables: 8924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 10013872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210788 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:33.057 11:39:23 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.057 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.057 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.057 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.057 11:39:23 -- setup/common.sh@32 -- # [[ MemFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.057 11:39:23 -- setup/common.sh@32 -- # continue
[... the scan steps through every remaining /proc/meminfo key (MemAvailable, Buffers, Cached, ... HardwareCorrupted) the same way: one "[[ key == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" test and one "# continue" per key ...]
00:03:33.323 11:39:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:33.323 11:39:23 -- setup/common.sh@33 -- # echo 0
00:03:33.323 11:39:23 -- setup/common.sh@33 -- # return 0
00:03:33.323 11:39:23 -- setup/hugepages.sh@97 -- # anon=0
00:03:33.323 11:39:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:33.323 11:39:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.323 11:39:23 -- setup/common.sh@18 -- # local node=
00:03:33.323 11:39:23 -- setup/common.sh@19 -- # local var val
00:03:33.323 11:39:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.323 11:39:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.323 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.323 11:39:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.323 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.323 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.323 11:39:23 -- setup/common.sh@31 -- # IFS=': '
00:03:33.323 11:39:23 -- setup/common.sh@31 -- # read -r var val _
00:03:33.323 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 73820008 kB' 'MemAvailable: 77543192 kB' 'Buffers: 9664 kB' 'Cached: 12375180 kB' 'SwapCached: 0 kB' 'Active: 9400344 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677436 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 571104 kB' 'Mapped: 197392 kB' 'Shmem: 8109548 kB' 'KReclaimable: 203020 kB' 'Slab: 580416 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377396 kB' 'KernelStack: 16272 kB' 'PageTables: 9072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962508 kB' 'Committed_AS: 10013884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210868 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
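[annotation] The backslash-escaped keys in the trace (e.g. \A\n\o\n\H\u\g\e\P\a\g\e\s) are just bash xtrace quoting the right-hand side of a literal [[ == ]] comparison. What the trace shows is a linear scan of a meminfo snapshot: take the file once with mapfile, then split each entry on ': ' and skip ("# continue") every key until the requested one. A minimal self-contained sketch of that pattern follows; get_meminfo_value is an assumed name for illustration, not SPDK's real helper:

    #!/usr/bin/env bash
    # Sketch of the lookup pattern traced above (assumed helper name).
    get_meminfo_value() {
        local get=$1 var val _ line
        local -a mem
        mapfile -t mem < /proc/meminfo               # snapshot once, scan in memory
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # split "Key: value kB"
            [[ $var == "$get" ]] || continue         # the repeated '# continue' steps
            echo "$val"                              # kB for sizes, a count for HugePages_*
            return 0
        done
        return 1
    }
    get_meminfo_value AnonHugePages   # prints 0 on this box, per the snapshot above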
[... scan for HugePages_Surp: each snapshot key from MemTotal through HugePages_Rsvd is compared and skipped with "# continue" ...]
00:03:33.324 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.324 11:39:23 -- setup/common.sh@33 -- # echo 0
00:03:33.324 11:39:23 -- setup/common.sh@33 -- # return 0
00:03:33.324 11:39:23 -- setup/hugepages.sh@99 -- # surp=0
00:03:33.324 11:39:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
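[annotation] get_meminfo's source selection is visible in the setup lines around each call: with no node argument the probe of /sys/devices/system/node/node/meminfo fails and it falls back to /proc/meminfo; the per-node calls later in this run pass a node number instead, and per-node lines carry a "Node N " prefix that common.sh@29 strips before scanning. A sketch of that selection under those assumptions (read_node_meminfo is an assumed name):

    shopt -s extglob   # needed for the +([0-9]) pattern below
    read_node_meminfo() {
        local node=$1 mem_f=/proc/meminfo
        local -a mem
        # node="" probes .../node/meminfo, which never exists -> system-wide file
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node 0 " prefix, as at common.sh@29
        printf '%s\n' "${mem[@]}"
    }
    read_node_meminfo ""    # /proc/meminfo
    read_node_meminfo 0     # node 0's meminfo, prefix stripped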
00:03:33.324 11:39:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:33.324 11:39:23 -- setup/common.sh@18 -- # local node=
00:03:33.324 11:39:23 -- setup/common.sh@19 -- # local var val
00:03:33.324 11:39:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.324 11:39:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.324 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.324 11:39:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.324 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.324 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.324 11:39:23 -- setup/common.sh@31 -- # IFS=': '
00:03:33.324 11:39:23 -- setup/common.sh@31 -- # read -r var val _
00:03:33.325 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' [second /proc/meminfo snapshot, identical to the one above except: 'MemFree: 73820796 kB' 'MemAvailable: 77543980 kB' 'Cached: 12375196 kB' 'Active: 9400040 kB' 'Active(anon): 8677132 kB' 'AnonPages: 570808 kB' 'Shmem: 8109564 kB' 'Slab: 580476 kB' 'SUnreclaim: 377456 kB' 'KernelStack: 16464 kB' 'PageTables: 9568 kB' 'Committed_AS: 10015296 kB']
[... scan for HugePages_Rsvd: MemTotal through HugePages_Free compared and skipped with "# continue" ...]
00:03:33.326 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:33.326 11:39:23 -- setup/common.sh@33 -- # echo 0
00:03:33.326 11:39:23 -- setup/common.sh@33 -- # return 0
00:03:33.326 11:39:23 -- setup/hugepages.sh@100 -- # resv=0
00:03:33.326 11:39:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:03:33.326 nr_hugepages=1536
00:03:33.326 11:39:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:33.326 resv_hugepages=0
00:03:33.326 11:39:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:33.326 surplus_hugepages=0
00:03:33.326 11:39:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:33.326 anon_hugepages=0
00:03:33.326 11:39:23 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:33.326 11:39:23 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
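[annotation] The two arithmetic tests just traced are a consistency check on the bookkeeping gathered so far: the system-wide HugePages_Total must equal the requested nr_hugepages plus surplus plus reserved pages before the per-node split is examined. A worked sketch with this run's numbers (variable names mirror the trace; the standalone awk lookup is an illustration, not the script's own code):

    nr_hugepages=1536; surp=0; resv=0; anon=0
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1536 here
    if (( total == nr_hugepages + surp + resv )); then
        echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
    else
        echo "hugepage accounting mismatch: $total != $((nr_hugepages + surp + resv))" >&2
    fi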
00:03:33.326 11:39:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:33.326 11:39:23 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:33.326 11:39:23 -- setup/common.sh@18 -- # local node=
00:03:33.326 11:39:23 -- setup/common.sh@19 -- # local var val
00:03:33.326 11:39:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.326 11:39:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.326 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:33.326 11:39:23 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:33.326 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.326 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.326 11:39:23 -- setup/common.sh@31 -- # IFS=': '
00:03:33.326 11:39:23 -- setup/common.sh@31 -- # read -r var val _
00:03:33.326 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' [third /proc/meminfo snapshot, identical to the previous one except: 'MemFree: 73821224 kB' 'MemAvailable: 77544408 kB' 'AnonPages: 570796 kB' 'KernelStack: 16384 kB' 'PageTables: 9500 kB' 'Committed_AS: 10015308 kB']
[... scan for HugePages_Total: MemTotal through Unaccepted compared and skipped with "# continue" ...]
00:03:33.328 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:33.328 11:39:23 -- setup/common.sh@33 -- # echo 1536
00:03:33.328 11:39:23 -- setup/common.sh@33 -- # return 0
00:03:33.328 11:39:23 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:33.328 11:39:23 -- setup/hugepages.sh@112 -- # get_nodes
00:03:33.328 11:39:23 -- setup/hugepages.sh@27 -- # local node
00:03:33.328 11:39:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:33.328 11:39:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:33.328 11:39:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:33.328 11:39:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:33.328 11:39:23 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:33.328 11:39:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
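[annotation] get_nodes enumerates the NUMA nodes under /sys/devices/system/node and records each node's hugepage count; this run found two nodes holding 512 and 1024 pages, which matches the 1536 system-wide total checked above. A sketch of that enumeration; the per-node sysfs counter path used here is the standard kernel interface for 2 MiB pages, assumed for illustration rather than taken from the trace:

    shopt -s extglob nullglob
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # e.g. /sys/devices/system/node/node0 -> index 0
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]}"                # 2 on this machine
    for i in "${!nodes_sys[@]}"; do
        echo "node$i: ${nodes_sys[$i]} hugepages"   # 512 and 1024 in this run
    done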
00:03:33.328 11:39:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:33.328 11:39:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:33.328 11:39:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:33.328 11:39:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.328 11:39:23 -- setup/common.sh@18 -- # local node=0
00:03:33.328 11:39:23 -- setup/common.sh@19 -- # local var val
00:03:33.328 11:39:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.328 11:39:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:33.328 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:33.328 11:39:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:33.328 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem
00:03:33.328 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:33.328 11:39:23 -- setup/common.sh@31 -- # IFS=': '
00:03:33.328 11:39:23 -- setup/common.sh@31 -- # read -r var val _
00:03:33.328 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 41952044 kB' 'MemUsed: 6164920 kB' 'SwapCached: 0 kB' 'Active: 3188520 kB' 'Inactive: 140552 kB' 'Active(anon): 2929000 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3092120 kB' 'Mapped: 131356 kB' 'AnonPages: 240008 kB' 'Shmem: 2692048 kB' 'KernelStack: 8424 kB' 'PageTables: 5284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 351168 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 237552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... scan of the node0 snapshot for HugePages_Surp: MemTotal through HugePages_Free compared and skipped with "# continue" ...]
00:03:33.329 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:33.329 11:39:23 -- setup/common.sh@33 -- # echo 0
00:03:33.329 11:39:23 -- setup/common.sh@33 -- # return 0
00:03:33.329 11:39:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:33.329 11:39:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:33.329 11:39:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:33.329 11:39:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:33.329 11:39:23 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:33.329 11:39:23 -- setup/common.sh@18 -- # local node=1
00:03:33.329 11:39:23 -- setup/common.sh@19 -- # local var val
00:03:33.329 11:39:23 -- setup/common.sh@20 -- # local mem_f mem
00:03:33.329 11:39:23 --
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.329 11:39:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:33.329 11:39:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:33.329 11:39:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.329 11:39:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.329 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.329 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176572 kB' 'MemFree: 31868668 kB' 'MemUsed: 12307904 kB' 'SwapCached: 0 kB' 'Active: 6211456 kB' 'Inactive: 3411836 kB' 'Active(anon): 5748068 kB' 'Inactive(anon): 0 kB' 'Active(file): 463388 kB' 'Inactive(file): 3411836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9292780 kB' 'Mapped: 66000 kB' 'AnonPages: 330664 kB' 'Shmem: 5417556 kB' 'KernelStack: 7992 kB' 'PageTables: 4036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 89404 kB' 'Slab: 229308 kB' 'SReclaimable: 89404 kB' 'SUnreclaim: 139904 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 
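As a quick consistency check on the per-node records being scanned here: each node's MemUsed line is just MemTotal minus MemFree. Node 0 above: 48116964 - 41952044 = 6164920 kB; node 1: 44176572 - 31868668 = 12307904 kB. Both match the MemUsed values in the dumps.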
00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- 
setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.330 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.330 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 
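This second scan ends just below the same way the node-0 one did (echo 0, return 0): neither node has surplus huge pages. What hugepages.sh@115-130 then does with those numbers, condensed from the trace — resv is 0 in this run, and since both arrays hold 512/1024 the trace does not distinguish which of nodes_test/nodes_sys supplies the "expecting" value, so that detail is an assumption:

    nodes_sys=([0]=512 [1]=1024)    # read from /sys/devices/system/node earlier (@29-30)
    nodes_test=([0]=512 [1]=1024)   # the per-node allocation under test
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                  # @116
        surp=$(get_meminfo HugePages_Surp "$node")      # @117, 0 on both nodes here
        (( nodes_test[node] += surp ))
    done
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1                    # @127 bookkeeping
        sorted_s[nodes_sys[node]]=1
        echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"   # @128
    done
    [[ 512,1024 == "512,1024" ]]    # @130: both joined value lists agree, test passes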
00:03:33.331 11:39:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # continue 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.331 11:39:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.331 11:39:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.331 11:39:23 -- setup/common.sh@33 -- # echo 0 00:03:33.331 11:39:23 -- setup/common.sh@33 -- # return 0 00:03:33.331 11:39:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:33.331 11:39:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.331 11:39:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.331 11:39:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.331 11:39:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:33.331 node0=512 expecting 512 00:03:33.331 11:39:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.331 11:39:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.331 11:39:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.331 11:39:23 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:33.331 node1=1024 expecting 1024 00:03:33.331 11:39:23 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:33.331 00:03:33.331 real 0m5.466s 00:03:33.331 user 0m1.829s 00:03:33.331 sys 0m3.665s 00:03:33.331 11:39:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:33.331 11:39:23 -- common/autotest_common.sh@10 -- # set +x 00:03:33.331 ************************************ 00:03:33.331 END TEST custom_alloc 00:03:33.331 ************************************ 00:03:33.331 11:39:23 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:33.331 11:39:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:33.331 11:39:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:33.331 11:39:23 -- common/autotest_common.sh@10 -- # set +x 00:03:33.592 ************************************ 00:03:33.592 START TEST no_shrink_alloc 00:03:33.592 ************************************ 00:03:33.592 11:39:23 -- common/autotest_common.sh@1111 -- # no_shrink_alloc 00:03:33.592 11:39:23 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:33.592 11:39:23 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:33.592 11:39:23 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:33.592 11:39:23 -- setup/hugepages.sh@51 -- # shift 00:03:33.592 11:39:23 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:33.592 11:39:23 -- setup/hugepages.sh@52 -- # local node_ids 00:03:33.592 11:39:23 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:33.592 11:39:23 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:33.592 11:39:23 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:33.592 11:39:23 -- 
setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:33.592 11:39:23 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:33.592 11:39:23 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:33.592 11:39:23 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:33.592 11:39:23 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:33.592 11:39:23 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:33.592 11:39:23 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:33.592 11:39:23 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:33.592 11:39:23 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:33.592 11:39:23 -- setup/hugepages.sh@73 -- # return 0 00:03:33.592 11:39:23 -- setup/hugepages.sh@198 -- # setup output 00:03:33.592 11:39:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:33.592 11:39:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:37.793 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:37.793 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:37.793 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:39.178 11:39:29 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:39.178 11:39:29 -- setup/hugepages.sh@89 -- # local node 00:03:39.178 11:39:29 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.178 11:39:29 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.178 11:39:29 -- setup/hugepages.sh@92 -- # local surp 00:03:39.178 11:39:29 -- setup/hugepages.sh@93 -- # local resv 00:03:39.178 11:39:29 -- setup/hugepages.sh@94 -- # local anon 00:03:39.178 11:39:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.178 11:39:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.178 11:39:29 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.178 11:39:29 -- setup/common.sh@18 -- # local node= 00:03:39.178 11:39:29 -- setup/common.sh@19 -- # local var val 00:03:39.178 11:39:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.178 11:39:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.179 11:39:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.179 11:39:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.179 11:39:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.179 11:39:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74852032 kB' 'MemAvailable: 78575216 kB' 'Buffers: 9664 kB' 'Cached: 12375332 kB' 'SwapCached: 0 kB' 'Active: 9400104 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677196 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570816 kB' 'Mapped: 198520 kB' 'Shmem: 8109700 kB' 'KReclaimable: 203020 kB' 'Slab: 580424 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377404 kB' 'KernelStack: 16032 kB' 'PageTables: 8524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10016112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 
11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
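Stepping back to the no_shrink_alloc setup traced just before these scans (hugepages.sh@49-73): the test requested 2097152 kB on node 0 and the trace shows nr_hugepages=1024 falling out. A sketch of that sizing step; the division by Hugepagesize (2048 kB per the meminfo dumps) is inferred from the numbers, not read from the script:

    size=2097152                  # kB, the first argument (@49)
    default_hugepages=2048        # kB; assumed from the Hugepagesize lines above
    (( size >= default_hugepages ))               # sanity guard seen at @55
    nr_hugepages=$(( size / default_hugepages ))  # 2097152 / 2048 = 1024 (@57)
    user_nodes=(0)                                # @62, from the trailing node id
    for node in "${user_nodes[@]}"; do
        nodes_test[node]=$nr_hugepages            # @71: nodes_test[0]=1024
    done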
00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.179 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.179 11:39:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.180 11:39:29 -- setup/common.sh@33 -- # echo 0 00:03:39.180 11:39:29 -- setup/common.sh@33 -- # return 0 00:03:39.180 11:39:29 -- setup/hugepages.sh@97 -- # anon=0 00:03:39.180 11:39:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:39.180 11:39:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.180 11:39:29 -- setup/common.sh@18 -- # local node= 00:03:39.180 11:39:29 -- setup/common.sh@19 -- # local var val 00:03:39.180 11:39:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.180 11:39:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.180 11:39:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.180 11:39:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.180 11:39:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.180 11:39:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74851900 kB' 'MemAvailable: 78575084 kB' 'Buffers: 9664 kB' 'Cached: 12375332 kB' 'SwapCached: 0 kB' 'Active: 9400596 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677688 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 571292 kB' 'Mapped: 197512 kB' 'Shmem: 8109700 kB' 'KReclaimable: 
203020 kB' 'Slab: 580396 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377376 kB' 'KernelStack: 15984 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10013692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210724 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.180 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.180 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
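The three long scans in this stretch are verify_nr_hugepages (hugepages.sh@89-100) reading AnonHugePages, HugePages_Surp, and HugePages_Rsvd in turn; the records dumped above carry 0 for all three. In outline — only the [never] test and the get_meminfo calls appear in the trace, so the sysfs read and its missing-file fallback are assumptions:

    # "always [madvise] never" on this box, per the test at hugepages.sh@96
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null || echo '[never]')
    anon=0
    if [[ $thp != *"[never]"* ]]; then      # THP not globally disabled
        anon=$(get_meminfo AnonHugePages)   # @97 -> 0
    fi
    surp=$(get_meminfo HugePages_Surp)      # @99 -> 0
    resv=$(get_meminfo HugePages_Rsvd)      # @100; the records above show 0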
00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.181 11:39:29 -- setup/common.sh@33 -- # echo 0 00:03:39.181 11:39:29 -- setup/common.sh@33 -- # return 0 00:03:39.181 11:39:29 -- setup/hugepages.sh@99 -- # surp=0 00:03:39.181 11:39:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:39.181 11:39:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:39.181 11:39:29 -- setup/common.sh@18 -- # local node= 00:03:39.181 11:39:29 -- setup/common.sh@19 -- # local var val 00:03:39.181 11:39:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.181 11:39:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.181 11:39:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.181 11:39:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.181 11:39:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.181 11:39:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.181 11:39:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74851820 kB' 'MemAvailable: 78575004 kB' 'Buffers: 9664 kB' 'Cached: 12375348 kB' 'SwapCached: 0 kB' 'Active: 9399852 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676944 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570900 kB' 'Mapped: 197436 kB' 'Shmem: 8109716 kB' 'KReclaimable: 203020 kB' 'Slab: 580412 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377392 kB' 'KernelStack: 16000 kB' 'PageTables: 8392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10013708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210724 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue 00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.181 11:39:29 
-- setup/common.sh@31 -- # read -r var val _
00:03:39.181 11:39:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.181 11:39:29 -- setup/common.sh@32 -- # continue
00:03:39.181 11:39:29 -- setup/common.sh@31 -- # IFS=': '
00:03:39.181 11:39:29 -- setup/common.sh@31 -- # read -r var val _
[... one identical "[[ <field> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]" / continue / IFS / read group per remaining /proc/meminfo field, MemAvailable through HugePages_Free ...]
00:03:39.182 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:39.182 11:39:29 -- setup/common.sh@33 -- # echo 0
00:03:39.182 11:39:29 -- setup/common.sh@33 -- # return 0
00:03:39.182 11:39:29 -- setup/hugepages.sh@100 -- # resv=0
00:03:39.182 11:39:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:39.182 nr_hugepages=1024
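The long match/continue run above is setup/common.sh's get_meminfo helper walking every "Key: value" pair of the meminfo data until it reaches the requested key (HugePages_Rsvd here), which is why each non-matching field produces one trace entry. A minimal standalone sketch of the same scan — the function name is illustrative, not the real helper, which also handles the per-node files shown later:

    get_meminfo_sketch() {                     # hypothetical name, not the real helper
        local get=$1 var val _
        # /proc/meminfo lines look like "HugePages_Rsvd:        0";
        # IFS=': ' splits the key from the value, exactly as in the trace
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # each mismatch is one continue entry above
            echo "$val"                        # value only; a trailing "kB" lands in $_
            return 0
        done < /proc/meminfo
        return 1                               # requested key not present
    }

    resv=$(get_meminfo_sketch HugePages_Rsvd)  # -> 0, matching resv=0 above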
00:03:39.182 11:39:29 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:39.183 resv_hugepages=0
00:03:39.183 11:39:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:39.183 surplus_hugepages=0
00:03:39.183 11:39:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:39.183 anon_hugepages=0
00:03:39.183 11:39:29 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:39.183 11:39:29 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:39.183 11:39:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:39.183 11:39:29 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:39.183 11:39:29 -- setup/common.sh@18 -- # local node=
00:03:39.183 11:39:29 -- setup/common.sh@19 -- # local var val
00:03:39.183 11:39:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.183 11:39:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.183 11:39:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.183 11:39:29 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.183 11:39:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.183 11:39:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.183 11:39:29 -- setup/common.sh@31 -- # IFS=': '
00:03:39.183 11:39:29 -- setup/common.sh@31 -- # read -r var val _
00:03:39.183 11:39:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74851820 kB' 'MemAvailable: 78575004 kB' 'Buffers: 9664 kB' 'Cached: 12375360 kB' 'SwapCached: 0 kB' 'Active: 9400368 kB' 'Inactive: 3552388 kB' 'Active(anon): 8677460 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570900 kB' 'Mapped: 197436 kB' 'Shmem: 8109728 kB' 'KReclaimable: 203020 kB' 'Slab: 580412 kB' 'SReclaimable: 203020 kB' 'SUnreclaim: 377392 kB' 'KernelStack: 16000 kB' 'PageTables: 8392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10013724 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210724 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:39.183 11:39:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:39.183 11:39:29 -- setup/common.sh@32 -- # continue
00:03:39.183 11:39:29 -- setup/common.sh@31 -- # IFS=': '
00:03:39.183 11:39:29 -- setup/common.sh@31 -- # read -r var val _
[... one "[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]" / continue / IFS / read group per remaining field until the key matches ...]
00:03:39.184 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:39.184 11:39:29 -- setup/common.sh@33 -- # echo 1024
00:03:39.184 11:39:29 -- setup/common.sh@33 -- # return 0
00:03:39.184 11:39:29 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:39.184 11:39:29 -- setup/hugepages.sh@112 -- # get_nodes
00:03:39.184 11:39:29 -- setup/hugepages.sh@27 -- # local node
00:03:39.184 11:39:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.184 11:39:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:39.184 11:39:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:39.184 11:39:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:39.184 11:39:29 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:39.184 11:39:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:39.184 11:39:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:39.184 11:39:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:39.184 11:39:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
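The two arithmetic entries above assert that HugePages_Total (1024) equals nr_hugepages plus surplus plus reserved pages, and get_nodes then enumerates /sys/devices/system/node/node0 and node1. The get_meminfo HugePages_Surp 0 call that follows repeats the key scan against node 0's sysfs file, which differs from the /proc path in two ways visible in the trace: the source file moves under sysfs, and each line there carries a "Node 0 " prefix that the mem=("${mem[@]#Node +([0-9]) }") step strips (an extglob pattern). A hedged sketch of that selection logic, with an illustrative function name:

    shopt -s extglob                              # the +([0-9]) pattern below is extglob

    node_meminfo_sketch() {                       # hypothetical wrapper over the same scan
        local node=$1 mem_f=/proc/meminfo mem
        # with a node argument, the per-node sysfs file replaces /proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"                 # one array element per line
        # sysfs lines read "Node 0 MemTotal: 48116964 kB"; drop the prefix
        mem=("${mem[@]#Node +([0-9]) }")
        printf '%s\n' "${mem[@]}"                 # feed the key scan sketched earlier
    }

    node_meminfo_sketch 0 | grep HugePages_Surp   # -> HugePages_Surp: 0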
00:03:39.184 11:39:29 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:39.184 11:39:29 -- setup/common.sh@18 -- # local node=0
00:03:39.184 11:39:29 -- setup/common.sh@19 -- # local var val
00:03:39.184 11:39:29 -- setup/common.sh@20 -- # local mem_f mem
00:03:39.184 11:39:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.184 11:39:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:39.184 11:39:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:39.184 11:39:29 -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.184 11:39:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.184 11:39:29 -- setup/common.sh@31 -- # IFS=': '
00:03:39.184 11:39:29 -- setup/common.sh@31 -- # read -r var val _
00:03:39.184 11:39:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40898948 kB' 'MemUsed: 7218016 kB' 'SwapCached: 0 kB' 'Active: 3188204 kB' 'Inactive: 140552 kB' 'Active(anon): 2928684 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3092188 kB' 'Mapped: 131328 kB' 'AnonPages: 239596 kB' 'Shmem: 2692116 kB' 'KernelStack: 7976 kB' 'PageTables: 4180 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 350932 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 237316 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:39.184 11:39:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.184 11:39:29 -- setup/common.sh@32 -- # continue
00:03:39.184 11:39:29 -- setup/common.sh@31 -- # IFS=': '
00:03:39.184 11:39:29 -- setup/common.sh@31 -- # read -r var val _
[... one "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / continue / IFS / read group per remaining node0 field ...]
00:03:39.185 11:39:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.185 11:39:29 -- setup/common.sh@33 -- # echo 0
00:03:39.185 11:39:29 -- setup/common.sh@33 -- # return 0
00:03:39.185 11:39:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:39.185 11:39:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:39.185 11:39:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:39.185 11:39:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:39.185 11:39:29 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:39.185 node0=1024 expecting 1024
00:03:39.185 11:39:29 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:39.185 11:39:29 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:39.185 11:39:29 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:39.185 11:39:29 -- setup/hugepages.sh@202 -- # setup output
00:03:39.185 11:39:29 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:39.185 11:39:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:42.478 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:42.478 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:42.478 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:44.387 INFO: Requested 512 hugepages but 1024 already allocated on node0
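The node0=1024 expecting 1024 line passes because node0's observed hugepage count matches the expected total (the sorted_t[nodes_test[node]]=1 / sorted_s[nodes_sys[node]]=1 assignments use array indices as a set of distinct counts, so the later checks can compare them). The test then reruns scripts/setup.sh with the NRHUGE and CLEAR_HUGE values set in the trace just above; reproduced by hand (workspace path as above; run as root, as the CI node does):

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    CLEAR_HUGE=no NRHUGE=512 scripts/setup.sh
    # CLEAR_HUGE=no keeps the existing pool, so the 1024 pages already on node0
    # cover the 512 requested -- hence the INFO line rather than a reallocation

After this rerun the suite verifies the counts again, which is the verify_nr_hugepages pass traced next.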
00:03:44.388 11:39:34 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:03:44.388 11:39:34 -- setup/hugepages.sh@89 -- # local node
00:03:44.388 11:39:34 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:44.388 11:39:34 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:44.388 11:39:34 -- setup/hugepages.sh@92 -- # local surp
00:03:44.388 11:39:34 -- setup/hugepages.sh@93 -- # local resv
00:03:44.388 11:39:34 -- setup/hugepages.sh@94 -- # local anon
00:03:44.388 11:39:34 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:44.388 11:39:34 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:44.388 11:39:34 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:44.388 11:39:34 -- setup/common.sh@18 -- # local node=
00:03:44.388 11:39:34 -- setup/common.sh@19 -- # local var val
00:03:44.388 11:39:34 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.388 11:39:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.388 11:39:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.388 11:39:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:44.388 11:39:34 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.388 11:39:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.388 11:39:34 -- setup/common.sh@31 -- # IFS=': '
00:03:44.388 11:39:34 -- setup/common.sh@31 -- # read -r var val _
00:03:44.388 11:39:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74864412 kB' 'MemAvailable: 78587612 kB' 'Buffers: 9664 kB' 'Cached: 12375464 kB' 'SwapCached: 0 kB' 'Active: 9399396 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676488 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569436 kB' 'Mapped: 197604 kB' 'Shmem: 8109832 kB' 'KReclaimable: 203052 kB' 'Slab: 581200 kB' 'SReclaimable: 203052 kB' 'SUnreclaim: 378148 kB' 'KernelStack: 16048 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10014452 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
00:03:44.388 11:39:34 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:44.388 11:39:34 -- setup/common.sh@32 -- # continue
00:03:44.388 11:39:34 -- setup/common.sh@31 -- # IFS=': '
00:03:44.388 11:39:34 -- setup/common.sh@31 -- # read -r var val _
[... one "[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / continue / IFS / read group per remaining field until the key matches ...]
00:03:44.388 11:39:34 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:44.388 11:39:34 -- setup/common.sh@33 -- # echo 0
00:03:44.388 11:39:34 -- setup/common.sh@33 -- # return 0
00:03:44.388 11:39:34 -- setup/hugepages.sh@97 -- # anon=0
00:03:44.388 11:39:34 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
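The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] entry above is verify_nr_hugepages testing the kernel's transparent-hugepage mode string before it reads AnonHugePages: the kernel brackets the active mode, and anon pages are only worth counting when that mode is not [never]. A sketch of the probe, reusing the scan helper sketched earlier; the disabled branch is an assumption, since this run only exercises the enabled path:

    thp_active_sketch() {                         # hypothetical name for the probe above
        local mode
        mode=$(</sys/kernel/mm/transparent_hugepage/enabled)
        # e.g. "always [madvise] never"; THP is off only when [never] is active
        [[ $mode != *"[never]"* ]]
    }

    if thp_active_sketch; then
        anon=$(get_meminfo_sketch AnonHugePages)  # -> 0 in this run
    else
        anon=0                                    # assumed fallback; not shown in the trace
    fi

With anon known, the trace continues with the system-wide HugePages_Surp read below.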
setup/common.sh@20 -- # local mem_f mem 00:03:44.389 11:39:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:44.389 11:39:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:44.389 11:39:34 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:44.389 11:39:34 -- setup/common.sh@28 -- # mapfile -t mem 00:03:44.389 11:39:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74864140 kB' 'MemAvailable: 78587340 kB' 'Buffers: 9664 kB' 'Cached: 12375468 kB' 'SwapCached: 0 kB' 'Active: 9399600 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676692 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569660 kB' 'Mapped: 197604 kB' 'Shmem: 8109836 kB' 'KReclaimable: 203052 kB' 'Slab: 581200 kB' 'SReclaimable: 203052 kB' 'SUnreclaim: 378148 kB' 'KernelStack: 16048 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10014464 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210740 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB' 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:44.389 11:39:34 -- setup/common.sh@32 -- # continue 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # IFS=': ' 00:03:44.389 11:39:34 -- setup/common.sh@31 -- # read -r var val _ 
[... HugePages_Surp scan elided: every key from MemTotal through HugePages_Rsvd misses the \H\u\g\e\P\a\g\e\s\_\S\u\r\p pattern and hits the 'continue' branch ...]
00:03:44.390 11:39:34 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.390 11:39:34 -- setup/common.sh@33 -- # echo 0
00:03:44.390 11:39:34 -- setup/common.sh@33 -- # return 0
00:03:44.390 11:39:34 -- setup/hugepages.sh@99 -- # surp=0
00:03:44.390 11:39:34 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:44.390 11:39:34 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:44.390 11:39:34 -- setup/common.sh@18 -- # local node=
00:03:44.390 11:39:34 -- setup/common.sh@19 -- # local var val
00:03:44.390 11:39:34 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.390 11:39:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.390 11:39:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.390 11:39:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:44.390 11:39:34 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.390 11:39:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.390 11:39:34 -- setup/common.sh@31 -- # IFS=': '
00:03:44.390 11:39:34 -- setup/common.sh@31 -- # read -r var val _
00:03:44.390 11:39:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74864572 kB' 'MemAvailable: 78587772 kB' 'Buffers: 9664 kB' 'Cached: 12375476 kB' 'SwapCached: 0 kB' 'Active: 9399388 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676480 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569948 kB' 'Mapped: 197528 kB' 'Shmem: 8109844 kB' 'KReclaimable: 203052 kB' 'Slab: 581152 kB' 'SReclaimable: 203052 kB' 'SUnreclaim: 378100 kB' 'KernelStack: 16048 kB' 'PageTables: 8540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10014480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210756 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
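A note on what is being read back: these per-pass snapshots come straight out of /proc/meminfo, and the hugepage counters the script extracts have distinct meanings (summarized here from the kernel's hugetlbpage documentation). A quick way to eyeball them outside the test harness:

# The counters the trace extracts, viewable directly; the values in
# the comments are the ones from this run's snapshot.
grep -E '^HugePages_(Total|Free|Rsvd|Surp)' /proc/meminfo
# HugePages_Total: 1024   <- pool size (nr_hugepages)
# HugePages_Free:  1024   <- pages not yet handed to any mapping
# HugePages_Rsvd:  0      <- promised to mappings, not yet faulted in
# HugePages_Surp:  0      <- overcommit pages above nr_hugepages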
[... HugePages_Rsvd scan elided: every key from MemTotal through HugePages_Free misses the \H\u\g\e\P\a\g\e\s\_\R\s\v\d pattern and hits the 'continue' branch ...]
00:03:44.392 11:39:34 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:44.392 11:39:34 -- setup/common.sh@33 -- # echo 0
00:03:44.392 11:39:34 -- setup/common.sh@33 -- # return 0
00:03:44.392 11:39:34 -- setup/hugepages.sh@100 -- # resv=0
00:03:44.392 11:39:34 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:44.392 nr_hugepages=1024
00:03:44.392 11:39:34 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:44.392 resv_hugepages=0
00:03:44.392 11:39:34 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:44.392 surplus_hugepages=0
00:03:44.392 11:39:34 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:44.392 anon_hugepages=0
00:03:44.392 11:39:34 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:44.392 11:39:34 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:44.392 11:39:34 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:44.392 11:39:34 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:44.392 11:39:34 -- setup/common.sh@18 -- # local node=
00:03:44.392 11:39:34 -- setup/common.sh@19 -- # local var val
00:03:44.392 11:39:34 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.392 11:39:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.392 11:39:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:44.392 11:39:34 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:44.392 11:39:34 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.392 11:39:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.392 11:39:34 -- setup/common.sh@31 -- # IFS=': '
00:03:44.392 11:39:34 -- setup/common.sh@31 -- # read -r var val _
00:03:44.392 11:39:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293536 kB' 'MemFree: 74865476 kB' 'MemAvailable: 78588676 kB' 'Buffers: 9664 kB' 'Cached: 12375492 kB' 'SwapCached: 0 kB' 'Active: 9399372 kB' 'Inactive: 3552388 kB' 'Active(anon): 8676464 kB' 'Inactive(anon): 0 kB' 'Active(file): 722908 kB' 'Inactive(file): 3552388 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 569912 kB' 'Mapped: 197528 kB' 'Shmem: 8109860 kB' 'KReclaimable: 203052 kB' 'Slab: 581152 kB' 'SReclaimable: 203052 kB' 'SUnreclaim: 378100 kB' 'KernelStack: 16032 kB' 'PageTables: 8488 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486796 kB' 'Committed_AS: 10014496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210756 kB' 'VmallocChunk: 0 kB' 'Percpu: 53120 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 456124 kB' 'DirectMap2M: 13899776 kB' 'DirectMap1G: 87031808 kB'
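The (( ... )) checks a few records up are the crux of the test. With this run's values plugged in they reduce to the following; this is a condensed restatement for readability, not the script itself:

# The bookkeeping the trace just verified (anon is in kB; the
# HugePages_* counters are page counts):
nr_hugepages=1024  # from get_meminfo HugePages_Total
surp=0             # from get_meminfo HugePages_Surp
resv=0             # from get_meminfo HugePages_Rsvd
anon=0             # from get_meminfo AnonHugePages
(( 1024 == nr_hugepages + surp + resv )) || echo "unexpected pool size"
(( 1024 == nr_hugepages ))               || echo "unexpected total"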
[... HugePages_Total scan elided: every key from MemTotal through Unaccepted misses the \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l pattern and hits the 'continue' branch ...]
00:03:44.393 11:39:34 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:44.393 11:39:34 -- setup/common.sh@33 -- # echo 1024
00:03:44.393 11:39:34 -- setup/common.sh@33 -- # return 0
00:03:44.393 11:39:34 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
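Worth noting how those numbers travel: get_meminfo prints its result on stdout and returns 0, so callers capture it with command substitution rather than through a global. The calling idiom, roughly (names mirror the trace; this assumes the get_meminfo sketch given earlier):

# How the hugepages checks consume get_meminfo's output:
surp=$(get_meminfo HugePages_Surp)
resv=$(get_meminfo HugePages_Rsvd)
total=$(get_meminfo HugePages_Total)
(( total == 1024 )) || echo "expected 1024 huge pages, got $total"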
00:03:44.393 11:39:34 -- setup/hugepages.sh@112 -- # get_nodes
00:03:44.393 11:39:34 -- setup/hugepages.sh@27 -- # local node
00:03:44.393 11:39:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:44.393 11:39:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:44.393 11:39:34 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:44.393 11:39:34 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:44.393 11:39:34 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:44.393 11:39:34 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:44.393 11:39:34 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:44.393 11:39:34 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:44.393 11:39:34 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:44.393 11:39:34 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:44.393 11:39:34 -- setup/common.sh@18 -- # local node=0
00:03:44.393 11:39:34 -- setup/common.sh@19 -- # local var val
00:03:44.393 11:39:34 -- setup/common.sh@20 -- # local mem_f mem
00:03:44.393 11:39:34 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:44.393 11:39:34 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:44.393 11:39:34 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:44.393 11:39:34 -- setup/common.sh@28 -- # mapfile -t mem
00:03:44.393 11:39:34 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:44.393 11:39:34 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40906356 kB' 'MemUsed: 7210608 kB' 'SwapCached: 0 kB' 'Active: 3187628 kB' 'Inactive: 140552 kB' 'Active(anon): 2928108 kB' 'Inactive(anon): 0 kB' 'Active(file): 259520 kB' 'Inactive(file): 140552 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3092276 kB' 'Mapped: 131364 kB' 'AnonPages: 239088 kB' 'Shmem: 2692204 kB' 'KernelStack: 8024 kB' 'PageTables: 4344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 113616 kB' 'Slab: 351000 kB' 'SReclaimable: 113616 kB' 'SUnreclaim: 237384 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:44.393 11:39:34 -- setup/common.sh@31 -- # IFS=': '
00:03:44.393 11:39:34 -- setup/common.sh@31 -- # read -r var val _
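From here the check goes NUMA-aware: get_nodes globs /sys/devices/system/node/node+([0-9]), records the per-node pool sizes it expects (1024 pages on node0, 0 on node1 in this run), and re-runs get_meminfo against each node's own meminfo file. The same per-node view can be pulled directly from sysfs, as a sketch (node count and IDs are machine-specific):

# Print each NUMA node's huge page pool from the same files the
# trace walks:
shopt -s nullglob
for node in /sys/devices/system/node/node[0-9]*; do
    id=${node##*node}
    total=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
    echo "node$id: HugePages_Total=$total"
done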
[... node0 HugePages_Surp scan elided: every key from MemTotal through HugePages_Free misses the \H\u\g\e\P\a\g\e\s\_\S\u\r\p pattern and hits the 'continue' branch ...]
00:03:44.394 11:39:34 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:44.394 11:39:34 -- setup/common.sh@33 -- # echo 0
00:03:44.394 11:39:34 -- setup/common.sh@33 -- # return 0
00:03:44.394 11:39:34 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:44.394 11:39:34 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:44.394 11:39:34 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:44.394 11:39:34 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:44.394 11:39:34 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:44.394 node0=1024 expecting 1024
00:03:44.394 11:39:34 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:44.394
00:03:44.394 real 0m10.937s
00:03:44.394 user 0m3.780s
00:03:44.394 sys 0m7.222s
00:03:44.394 11:39:34 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:44.394 11:39:34 -- common/autotest_common.sh@10 -- # set +x
00:03:44.394 ************************************
00:03:44.394 END TEST no_shrink_alloc
00:03:44.394 ************************************
00:03:44.394 11:39:34 -- setup/hugepages.sh@217 -- # clear_hp
00:03:44.394 11:39:34 -- setup/hugepages.sh@37 -- # local node hp
00:03:44.394 11:39:34 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:03:44.394 11:39:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:03:44.654 11:39:34 -- setup/hugepages.sh@41 -- # echo 0
00:03:44.654 11:39:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:44.654 11:39:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:44.654 11:39:34 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:44.654 11:39:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:44.654 11:39:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:44.654 11:39:34 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:44.654 11:39:34 -- setup/hugepages.sh@41 -- # echo 0 00:03:44.654 11:39:34 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:44.654 11:39:34 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:44.654 00:03:44.654 real 0m43.280s 00:03:44.654 user 0m13.716s 00:03:44.654 sys 0m26.485s 00:03:44.654 11:39:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:44.654 11:39:34 -- common/autotest_common.sh@10 -- # set +x 00:03:44.654 ************************************ 00:03:44.654 END TEST hugepages 00:03:44.654 ************************************ 00:03:44.654 11:39:34 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:44.654 11:39:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:44.654 11:39:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:44.654 11:39:34 -- common/autotest_common.sh@10 -- # set +x 00:03:44.654 ************************************ 00:03:44.654 START TEST driver 00:03:44.654 ************************************ 00:03:44.654 11:39:35 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:44.913 * Looking for test storage... 
00:03:44.913 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:44.913 11:39:35 -- setup/driver.sh@68 -- # setup reset 00:03:44.913 11:39:35 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:44.913 11:39:35 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:53.038 11:39:42 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:53.038 11:39:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:53.038 11:39:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:53.038 11:39:42 -- common/autotest_common.sh@10 -- # set +x 00:03:53.038 ************************************ 00:03:53.038 START TEST guess_driver 00:03:53.038 ************************************ 00:03:53.038 11:39:42 -- common/autotest_common.sh@1111 -- # guess_driver 00:03:53.038 11:39:42 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:53.038 11:39:42 -- setup/driver.sh@47 -- # local fail=0 00:03:53.038 11:39:42 -- setup/driver.sh@49 -- # pick_driver 00:03:53.038 11:39:42 -- setup/driver.sh@36 -- # vfio 00:03:53.038 11:39:42 -- setup/driver.sh@21 -- # local iommu_grups 00:03:53.038 11:39:42 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:53.038 11:39:42 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:53.038 11:39:42 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:53.038 11:39:42 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:53.038 11:39:42 -- setup/driver.sh@29 -- # (( 238 > 0 )) 00:03:53.038 11:39:42 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:53.038 11:39:42 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:53.038 11:39:42 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:53.038 11:39:42 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:53.038 11:39:42 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:53.038 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:53.038 11:39:42 -- setup/driver.sh@30 -- # return 0 00:03:53.038 11:39:42 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:53.038 11:39:42 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:53.038 11:39:42 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:53.038 11:39:42 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:53.038 Looking for driver=vfio-pci 00:03:53.038 11:39:42 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:53.038 11:39:42 -- setup/driver.sh@45 -- # setup output config 00:03:53.038 11:39:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.038 11:39:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:45 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:45 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:45 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.578 11:39:46 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.578 11:39:46 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.578 11:39:46 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:58.876 11:39:49 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:58.876 11:39:49 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:58.876 11:39:49 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:00.782 11:39:51 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:00.782 11:39:51 -- setup/driver.sh@65 -- # setup reset 00:04:00.782 11:39:51 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:00.782 11:39:51 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:09.011 00:04:09.011 real 0m15.845s 00:04:09.011 user 0m4.064s 00:04:09.011 sys 0m7.966s 00:04:09.011 11:39:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:09.011 11:39:58 -- common/autotest_common.sh@10 -- # set +x 00:04:09.011 ************************************ 00:04:09.011 END TEST guess_driver 00:04:09.011 ************************************ 00:04:09.011 00:04:09.011 real 0m22.952s 00:04:09.011 user 0m6.171s 00:04:09.011 sys 0m12.179s 00:04:09.011 11:39:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:09.011 11:39:58 -- common/autotest_common.sh@10 -- # set +x 00:04:09.011 ************************************ 00:04:09.011 END TEST driver 00:04:09.011 ************************************ 00:04:09.011 11:39:58 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:09.011 11:39:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:09.011 11:39:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:09.011 11:39:58 -- common/autotest_common.sh@10 -- # set +x 00:04:09.011 ************************************ 00:04:09.011 START TEST devices 00:04:09.011 ************************************ 00:04:09.011 11:39:58 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:09.011 * Looking for test storage... 
00:04:09.011 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:09.011 11:39:58 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:09.011 11:39:58 -- setup/devices.sh@192 -- # setup reset 00:04:09.011 11:39:58 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:09.011 11:39:58 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:14.291 11:40:04 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:14.291 11:40:04 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:14.291 11:40:04 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:14.291 11:40:04 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:14.291 11:40:04 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:14.291 11:40:04 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:14.291 11:40:04 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:14.291 11:40:04 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:14.291 11:40:04 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:14.291 11:40:04 -- setup/devices.sh@196 -- # blocks=() 00:04:14.291 11:40:04 -- setup/devices.sh@196 -- # declare -a blocks 00:04:14.291 11:40:04 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:14.291 11:40:04 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:14.291 11:40:04 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:14.291 11:40:04 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:14.291 11:40:04 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:14.291 11:40:04 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:14.291 11:40:04 -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:04:14.291 11:40:04 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:14.291 11:40:04 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:14.291 11:40:04 -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:14.291 11:40:04 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:14.291 No valid GPT data, bailing 00:04:14.291 11:40:04 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:14.291 11:40:04 -- scripts/common.sh@391 -- # pt= 00:04:14.291 11:40:04 -- scripts/common.sh@392 -- # return 1 00:04:14.291 11:40:04 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:14.291 11:40:04 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:14.291 11:40:04 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:14.291 11:40:04 -- setup/common.sh@80 -- # echo 4000787030016 00:04:14.291 11:40:04 -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:04:14.291 11:40:04 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:14.291 11:40:04 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:04:14.291 11:40:04 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:14.291 11:40:04 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:14.291 11:40:04 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:14.291 11:40:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:14.291 11:40:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:14.291 11:40:04 -- common/autotest_common.sh@10 -- # set +x 00:04:14.291 ************************************ 00:04:14.291 START TEST nvme_mount 00:04:14.291 ************************************ 00:04:14.291 
11:40:04 -- common/autotest_common.sh@1111 -- # nvme_mount 00:04:14.291 11:40:04 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:14.291 11:40:04 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:14.291 11:40:04 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:14.291 11:40:04 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:14.291 11:40:04 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:14.291 11:40:04 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:14.291 11:40:04 -- setup/common.sh@40 -- # local part_no=1 00:04:14.291 11:40:04 -- setup/common.sh@41 -- # local size=1073741824 00:04:14.291 11:40:04 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:14.291 11:40:04 -- setup/common.sh@44 -- # parts=() 00:04:14.291 11:40:04 -- setup/common.sh@44 -- # local parts 00:04:14.291 11:40:04 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:14.291 11:40:04 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.291 11:40:04 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:14.291 11:40:04 -- setup/common.sh@46 -- # (( part++ )) 00:04:14.291 11:40:04 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:14.291 11:40:04 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:14.291 11:40:04 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:14.291 11:40:04 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:15.230 Creating new GPT entries in memory. 00:04:15.230 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:15.230 other utilities. 00:04:15.230 11:40:05 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:15.230 11:40:05 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:15.230 11:40:05 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:15.230 11:40:05 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:15.230 11:40:05 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:16.171 Creating new GPT entries in memory. 00:04:16.171 The operation has completed successfully. 
00:04:16.171 11:40:06 -- setup/common.sh@57 -- # (( part++ )) 00:04:16.171 11:40:06 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:16.171 11:40:06 -- setup/common.sh@62 -- # wait 320285 00:04:16.171 11:40:06 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:16.171 11:40:06 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:16.171 11:40:06 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:16.171 11:40:06 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:16.171 11:40:06 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:16.171 11:40:06 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:16.171 11:40:06 -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:16.171 11:40:06 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:16.171 11:40:06 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:16.171 11:40:06 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:16.171 11:40:06 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:16.171 11:40:06 -- setup/devices.sh@53 -- # local found=0 00:04:16.171 11:40:06 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:16.171 11:40:06 -- setup/devices.sh@56 -- # : 00:04:16.171 11:40:06 -- setup/devices.sh@59 -- # local pci status 00:04:16.171 11:40:06 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:16.171 11:40:06 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:16.171 11:40:06 -- setup/devices.sh@47 -- # setup output config 00:04:16.171 11:40:06 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:16.171 11:40:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:20.371 11:40:10 -- setup/devices.sh@63 -- # found=1 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 
]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.371 11:40:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.371 11:40:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.750 11:40:12 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:21.750 11:40:12 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:21.750 11:40:12 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.750 11:40:12 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:21.750 11:40:12 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:21.750 11:40:12 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:21.750 11:40:12 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.750 11:40:12 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:21.750 11:40:12 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:21.750 11:40:12 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:21.750 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:21.750 11:40:12 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:21.750 11:40:12 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:22.009 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:22.009 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:22.009 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:22.009 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:22.009 11:40:12 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:22.009 11:40:12 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:22.009 11:40:12 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.009 11:40:12 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:22.009 11:40:12 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:22.270 11:40:12 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.270 11:40:12 -- setup/devices.sh@116 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.270 11:40:12 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:22.270 11:40:12 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:22.270 11:40:12 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.270 11:40:12 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.270 11:40:12 -- setup/devices.sh@53 -- # local found=0 00:04:22.270 11:40:12 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:22.270 11:40:12 -- setup/devices.sh@56 -- # : 00:04:22.270 11:40:12 -- setup/devices.sh@59 -- # local pci status 00:04:22.270 11:40:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.270 11:40:12 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:22.270 11:40:12 -- setup/devices.sh@47 -- # setup output config 00:04:22.270 11:40:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.270 11:40:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:25.578 11:40:15 -- setup/devices.sh@63 -- # found=1 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.578 11:40:15 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.578 11:40:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.485 11:40:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:27.485 11:40:17 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:27.485 11:40:17 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:27.485 11:40:17 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:27.485 11:40:17 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:27.485 11:40:17 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:27.485 11:40:17 -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:04:27.485 11:40:17 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:27.485 11:40:17 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:27.485 11:40:17 -- setup/devices.sh@50 -- # local mount_point= 00:04:27.485 11:40:17 -- setup/devices.sh@51 -- # local test_file= 00:04:27.485 11:40:17 -- setup/devices.sh@53 -- # local found=0 00:04:27.485 11:40:17 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:27.485 11:40:17 -- setup/devices.sh@59 -- # local pci status 00:04:27.485 11:40:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.485 11:40:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:27.485 11:40:17 -- setup/devices.sh@47 -- # setup output config 00:04:27.485 11:40:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:27.485 11:40:17 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:31.681 11:40:21 -- setup/devices.sh@63 -- # found=1 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.681 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.681 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.682 11:40:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:31.682 11:40:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.062 11:40:23 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:33.063 11:40:23 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:33.063 11:40:23 -- setup/devices.sh@68 -- # return 0 00:04:33.063 11:40:23 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:33.063 11:40:23 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.063 11:40:23 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:33.063 11:40:23 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:33.063 11:40:23 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:33.063 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:33.063 00:04:33.063 real 0m18.918s 00:04:33.063 user 0m5.564s 00:04:33.063 sys 0m11.059s 00:04:33.063 11:40:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:33.063 11:40:23 -- common/autotest_common.sh@10 -- # set +x 00:04:33.063 ************************************ 00:04:33.063 END TEST nvme_mount 00:04:33.063 ************************************ 00:04:33.063 11:40:23 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:33.063 11:40:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:33.063 11:40:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:33.063 11:40:23 -- common/autotest_common.sh@10 -- # set +x 00:04:33.322 ************************************ 00:04:33.322 START TEST dm_mount 00:04:33.322 ************************************ 00:04:33.322 11:40:23 -- common/autotest_common.sh@1111 -- # dm_mount 00:04:33.323 11:40:23 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:33.323 11:40:23 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:33.323 11:40:23 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:33.323 11:40:23 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:33.323 11:40:23 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:33.323 11:40:23 -- setup/common.sh@40 -- # local part_no=2 00:04:33.323 11:40:23 -- setup/common.sh@41 -- # local size=1073741824 00:04:33.323 11:40:23 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:33.323 11:40:23 -- setup/common.sh@44 -- # parts=() 00:04:33.323 11:40:23 -- setup/common.sh@44 -- # local parts 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:33.323 11:40:23 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part++ )) 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:33.323 11:40:23 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part++ )) 00:04:33.323 11:40:23 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:33.323 11:40:23 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:33.323 11:40:23 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:33.323 11:40:23 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:34.261 Creating new GPT entries in memory. 00:04:34.261 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:34.262 other utilities. 00:04:34.262 11:40:24 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:34.262 11:40:24 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:34.262 11:40:24 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:34.262 11:40:24 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:34.262 11:40:24 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:35.201 Creating new GPT entries in memory. 00:04:35.201 The operation has completed successfully. 
00:04:35.201 11:40:25 -- setup/common.sh@57 -- # (( part++ )) 00:04:35.201 11:40:25 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:35.201 11:40:25 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:35.201 11:40:25 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:35.201 11:40:25 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:36.585 The operation has completed successfully. 00:04:36.585 11:40:26 -- setup/common.sh@57 -- # (( part++ )) 00:04:36.585 11:40:26 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:36.585 11:40:26 -- setup/common.sh@62 -- # wait 325479 00:04:36.585 11:40:26 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:36.585 11:40:26 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.585 11:40:26 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.585 11:40:26 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:36.585 11:40:26 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:36.585 11:40:26 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.585 11:40:26 -- setup/devices.sh@161 -- # break 00:04:36.585 11:40:26 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.585 11:40:26 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:36.585 11:40:26 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:36.585 11:40:26 -- setup/devices.sh@166 -- # dm=dm-0 00:04:36.585 11:40:26 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:36.585 11:40:26 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:36.585 11:40:26 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.585 11:40:26 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:36.585 11:40:26 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.585 11:40:26 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:36.585 11:40:26 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:36.585 11:40:26 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.585 11:40:26 -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.585 11:40:26 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:36.585 11:40:26 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:36.585 11:40:26 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.585 11:40:26 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.585 11:40:26 -- setup/devices.sh@53 -- # local found=0 00:04:36.585 11:40:26 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:36.585 11:40:26 -- setup/devices.sh@56 -- # : 00:04:36.585 
11:40:26 -- setup/devices.sh@59 -- # local pci status 00:04:36.585 11:40:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.585 11:40:26 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:36.585 11:40:26 -- setup/devices.sh@47 -- # setup output config 00:04:36.585 11:40:26 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.585 11:40:26 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:39.882 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.882 11:40:30 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:39.882 11:40:30 -- setup/devices.sh@63 -- # found=1 00:04:39.882 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.882 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.882 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.882 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.883 11:40:30 -- setup/devices.sh@62 -- # [[ 
0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.883 11:40:30 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.789 11:40:32 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:41.789 11:40:32 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:41.789 11:40:32 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:41.789 11:40:32 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:41.789 11:40:32 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:41.789 11:40:32 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:41.789 11:40:32 -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:41.789 11:40:32 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:41.789 11:40:32 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:41.789 11:40:32 -- setup/devices.sh@50 -- # local mount_point= 00:04:41.789 11:40:32 -- setup/devices.sh@51 -- # local test_file= 00:04:41.789 11:40:32 -- setup/devices.sh@53 -- # local found=0 00:04:41.789 11:40:32 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:41.789 11:40:32 -- setup/devices.sh@59 -- # local pci status 00:04:41.789 11:40:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.789 11:40:32 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:41.789 11:40:32 -- setup/devices.sh@47 -- # setup output config 00:04:41.789 11:40:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.789 11:40:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:45.988 11:40:35 -- setup/devices.sh@63 -- # found=1 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- 
# read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.988 11:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:45.988 11:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:47.457 11:40:37 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:47.457 11:40:37 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:47.457 11:40:37 -- setup/devices.sh@68 -- # return 0 00:04:47.457 11:40:37 -- setup/devices.sh@187 -- # cleanup_dm 00:04:47.457 11:40:37 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:47.457 11:40:37 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:47.457 11:40:37 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:47.457 11:40:37 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:47.457 11:40:37 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:47.457 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:47.457 11:40:37 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:47.457 11:40:37 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:47.457 00:04:47.457 real 0m14.083s 00:04:47.457 user 0m3.674s 00:04:47.457 sys 0m7.371s 00:04:47.457 11:40:37 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:47.457 11:40:37 -- common/autotest_common.sh@10 -- # set +x 00:04:47.457 ************************************ 00:04:47.457 END TEST dm_mount 00:04:47.457 ************************************ 00:04:47.457 11:40:37 -- setup/devices.sh@1 -- # cleanup 00:04:47.457 11:40:37 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:47.457 11:40:37 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:47.457 11:40:37 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:47.457 11:40:37 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:47.457 11:40:37 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:47.457 11:40:37 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:47.718 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:47.718 /dev/nvme0n1: 8 
bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:47.718 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:47.718 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:47.718 11:40:38 -- setup/devices.sh@12 -- # cleanup_dm 00:04:47.718 11:40:38 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:47.718 11:40:38 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:47.718 11:40:38 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:47.718 11:40:38 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:47.718 11:40:38 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:47.718 11:40:38 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:47.718 00:04:47.718 real 0m39.810s 00:04:47.718 user 0m11.468s 00:04:47.718 sys 0m22.824s 00:04:47.718 11:40:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:47.718 11:40:38 -- common/autotest_common.sh@10 -- # set +x 00:04:47.718 ************************************ 00:04:47.718 END TEST devices 00:04:47.718 ************************************ 00:04:47.718 00:04:47.718 real 2m24.258s 00:04:47.718 user 0m43.138s 00:04:47.718 sys 1m24.129s 00:04:47.718 11:40:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:47.718 11:40:38 -- common/autotest_common.sh@10 -- # set +x 00:04:47.718 ************************************ 00:04:47.718 END TEST setup.sh 00:04:47.718 ************************************ 00:04:47.718 11:40:38 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:51.918 Hugepages 00:04:51.918 node hugesize free / total 00:04:51.918 node0 1048576kB 0 / 0 00:04:51.918 node0 2048kB 2048 / 2048 00:04:51.918 node1 1048576kB 0 / 0 00:04:51.918 node1 2048kB 0 / 0 00:04:51.918 00:04:51.918 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:51.918 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:51.918 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:51.918 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:04:51.918 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:51.918 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:51.918 11:40:41 -- spdk/autotest.sh@130 -- # uname -s 00:04:51.918 11:40:41 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:04:51.918 11:40:41 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:04:51.918 11:40:41 -- common/autotest_common.sh@1517 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:55.210 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:55.210 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:55.210 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:58.501 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:00.404 11:40:50 -- common/autotest_common.sh@1518 -- # sleep 1 00:05:01.340 11:40:51 -- common/autotest_common.sh@1519 -- # bdfs=() 00:05:01.341 11:40:51 -- common/autotest_common.sh@1519 -- # local bdfs 00:05:01.341 11:40:51 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:05:01.341 11:40:51 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:05:01.341 11:40:51 -- common/autotest_common.sh@1499 -- # bdfs=() 00:05:01.341 11:40:51 -- common/autotest_common.sh@1499 -- # local bdfs 00:05:01.341 11:40:51 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:01.341 11:40:51 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:01.341 11:40:51 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:05:01.341 11:40:51 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:05:01.341 11:40:51 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:1a:00.0 00:05:01.341 11:40:51 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:05.539 Waiting for block devices as requested 00:05:05.539 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:05:05.539 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:05.539 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:05.539 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:05.539 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:05.539 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:05.539 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:05.798 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:05.798 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:05.798 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:06.057 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:06.057 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:06.057 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:06.057 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:06.317 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:06.317 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:06.317 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:08.222 11:40:58 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:05:08.222 11:40:58 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1488 -- # readlink -f /sys/class/nvme/nvme0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1488 -- # grep 0000:1a:00.0/nvme/nvme 00:05:08.222 11:40:58 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1489 -- # [[ -z 
/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:05:08.222 11:40:58 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:05:08.222 11:40:58 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:05:08.481 11:40:58 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:05:08.481 11:40:58 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:05:08.481 11:40:58 -- common/autotest_common.sh@1531 -- # grep oacs 00:05:08.481 11:40:58 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:05:08.481 11:40:58 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:05:08.481 11:40:58 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:05:08.481 11:40:58 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:08.481 11:40:58 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:08.481 11:40:58 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:05:08.481 11:40:58 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:05:08.481 11:40:58 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:05:08.481 11:40:58 -- common/autotest_common.sh@1543 -- # continue 00:05:08.481 11:40:58 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:08.481 11:40:58 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:08.481 11:40:58 -- common/autotest_common.sh@10 -- # set +x 00:05:08.481 11:40:58 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:08.481 11:40:58 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:08.481 11:40:58 -- common/autotest_common.sh@10 -- # set +x 00:05:08.481 11:40:58 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:12.675 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:12.675 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:15.211 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:17.749 11:41:07 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:17.749 11:41:07 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:17.749 11:41:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.749 11:41:07 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:17.749 11:41:07 -- common/autotest_common.sh@1577 -- # mapfile -t bdfs 00:05:17.749 11:41:07 -- common/autotest_common.sh@1577 -- # get_nvme_bdfs_by_id 0x0a54 00:05:17.749 11:41:07 -- common/autotest_common.sh@1563 -- # bdfs=() 00:05:17.749 11:41:07 -- common/autotest_common.sh@1563 -- # local bdfs 00:05:17.749 
11:41:07 -- common/autotest_common.sh@1565 -- # get_nvme_bdfs 00:05:17.749 11:41:07 -- common/autotest_common.sh@1499 -- # bdfs=() 00:05:17.749 11:41:07 -- common/autotest_common.sh@1499 -- # local bdfs 00:05:17.749 11:41:07 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:17.749 11:41:07 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:17.749 11:41:07 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:05:17.749 11:41:07 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:05:17.749 11:41:07 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:1a:00.0 00:05:17.749 11:41:07 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:05:17.749 11:41:07 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:05:17.749 11:41:07 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:05:17.749 11:41:07 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:17.749 11:41:07 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:05:17.749 11:41:07 -- common/autotest_common.sh@1572 -- # printf '%s\n' 0000:1a:00.0 00:05:17.749 11:41:07 -- common/autotest_common.sh@1578 -- # [[ -z 0000:1a:00.0 ]] 00:05:17.749 11:41:07 -- common/autotest_common.sh@1583 -- # spdk_tgt_pid=336326 00:05:17.749 11:41:07 -- common/autotest_common.sh@1584 -- # waitforlisten 336326 00:05:17.749 11:41:07 -- common/autotest_common.sh@1582 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:17.749 11:41:07 -- common/autotest_common.sh@817 -- # '[' -z 336326 ']' 00:05:17.749 11:41:07 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:17.749 11:41:07 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:17.749 11:41:07 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:17.749 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:17.749 11:41:07 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:17.749 11:41:07 -- common/autotest_common.sh@10 -- # set +x 00:05:17.749 [2024-04-18 11:41:07.906917] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
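
The device-ID filter traced above (get_nvme_bdfs_by_id 0x0a54) amounts to reading each controller's PCI "device" attribute out of sysfs and comparing it to the wanted ID. A minimal standalone sketch of the same check, assuming PCIe-attached controllers and the sysfs layout visible in this log:

    # List NVMe controllers whose PCI device ID matches the 0x0a54 seen above.
    target=0x0a54
    for ctrl in /sys/class/nvme/nvme*; do
        bdf=$(basename "$(readlink -f "$ctrl/device")")   # e.g. 0000:1a:00.0
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == "$target" ]] && echo "$bdf"
    done
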
00:05:17.749 [2024-04-18 11:41:07.907011] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid336326 ]
00:05:17.749 EAL: No free 2048 kB hugepages reported on node 1
00:05:17.749 [2024-04-18 11:41:08.052026] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:17.749 [2024-04-18 11:41:08.220695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:18.319 11:41:08 -- common/autotest_common.sh@846 -- # (( i == 0 ))
00:05:18.319 11:41:08 -- common/autotest_common.sh@850 -- # return 0
00:05:18.319 11:41:08 -- common/autotest_common.sh@1586 -- # bdf_id=0
00:05:18.319 11:41:08 -- common/autotest_common.sh@1587 -- # for bdf in "${bdfs[@]}"
00:05:18.319 11:41:08 -- common/autotest_common.sh@1588 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0
00:05:21.610 nvme0n1
00:05:21.610 11:41:11 -- common/autotest_common.sh@1590 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:21.610 [2024-04-18 11:41:12.032655] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal
00:05:21.610 request:
00:05:21.610 {
00:05:21.610 "nvme_ctrlr_name": "nvme0",
00:05:21.610 "password": "test",
00:05:21.610 "method": "bdev_nvme_opal_revert",
00:05:21.610 "req_id": 1
00:05:21.610 }
00:05:21.610 Got JSON-RPC error response
00:05:21.610 response:
00:05:21.610 {
00:05:21.610 "code": -32602,
00:05:21.610 "message": "Invalid parameters"
00:05:21.610 }
00:05:21.610 11:41:12 -- common/autotest_common.sh@1590 -- # true
00:05:21.610 11:41:12 -- common/autotest_common.sh@1591 -- # (( ++bdf_id ))
00:05:21.610 11:41:12 -- common/autotest_common.sh@1594 -- # killprocess 336326
00:05:21.610 11:41:12 -- common/autotest_common.sh@936 -- # '[' -z 336326 ']'
00:05:21.610 11:41:12 -- common/autotest_common.sh@940 -- # kill -0 336326
00:05:21.610 11:41:12 -- common/autotest_common.sh@941 -- # uname
00:05:21.610 11:41:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:21.610 11:41:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 336326
00:05:21.610 11:41:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:21.610 11:41:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:21.610 11:41:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 336326'
killing process with pid 336326
11:41:12 -- common/autotest_common.sh@955 -- # kill 336326
11:41:12 -- common/autotest_common.sh@960 -- # wait 336326
00:05:26.884 11:41:17 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']'
00:05:26.884 11:41:17 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']'
00:05:26.884 11:41:17 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]]
00:05:26.884 11:41:17 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]]
00:05:26.884 11:41:17 -- spdk/autotest.sh@162 -- # timing_enter lib
00:05:26.884 11:41:17 -- common/autotest_common.sh@710 -- # xtrace_disable
00:05:26.884 11:41:17 -- common/autotest_common.sh@10 -- # set +x
00:05:26.884 11:41:17 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh
11:41:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:26.884 11:41:17 -- common/autotest_common.sh@1093 -- # xtrace_disable
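
Note on the opal step above: bdev_nvme_opal_revert is expected to fail on a drive without OPAL support, and the harness runs it under "|| true" (hence the "# true" trace), so the -32602 JSON-RPC error does not fail the job. A hand-run sketch using only the commands already shown in this log:

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    $rpc bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0
    $rpc bdev_nvme_opal_revert -b nvme0 -p test || true   # tolerate 'not support opal'
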
11:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:26.884 ************************************ 00:05:26.884 START TEST env 00:05:26.884 ************************************ 00:05:26.884 11:41:17 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:27.150 * Looking for test storage... 00:05:27.150 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:27.150 11:41:17 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:27.150 11:41:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.150 11:41:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.150 11:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.150 ************************************ 00:05:27.150 START TEST env_memory 00:05:27.150 ************************************ 00:05:27.150 11:41:17 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:27.150 00:05:27.150 00:05:27.150 CUnit - A unit testing framework for C - Version 2.1-3 00:05:27.150 http://cunit.sourceforge.net/ 00:05:27.150 00:05:27.150 00:05:27.150 Suite: memory 00:05:27.150 Test: alloc and free memory map ...[2024-04-18 11:41:17.645194] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:27.150 passed 00:05:27.150 Test: mem map translation ...[2024-04-18 11:41:17.676354] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:27.150 [2024-04-18 11:41:17.676384] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:27.150 [2024-04-18 11:41:17.676447] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:27.150 [2024-04-18 11:41:17.676468] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:27.512 passed 00:05:27.512 Test: mem map registration ...[2024-04-18 11:41:17.721236] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:27.512 [2024-04-18 11:41:17.721265] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:27.512 passed 00:05:27.512 Test: mem map adjacent registrations ...passed 00:05:27.512 00:05:27.512 Run Summary: Type Total Ran Passed Failed Inactive 00:05:27.512 suites 1 1 n/a 0 0 00:05:27.512 tests 4 4 4 0 0 00:05:27.512 asserts 152 152 152 0 n/a 00:05:27.512 00:05:27.512 Elapsed time = 0.167 seconds 00:05:27.512 00:05:27.512 real 0m0.194s 00:05:27.512 user 0m0.173s 00:05:27.512 sys 0m0.020s 00:05:27.512 11:41:17 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:27.512 11:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.512 ************************************ 00:05:27.512 END TEST env_memory 00:05:27.512 
************************************ 00:05:27.512 11:41:17 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:27.512 11:41:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.512 11:41:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.512 11:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:27.512 ************************************ 00:05:27.512 START TEST env_vtophys 00:05:27.512 ************************************ 00:05:27.512 11:41:17 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:27.512 EAL: lib.eal log level changed from notice to debug 00:05:27.512 EAL: Detected lcore 0 as core 0 on socket 0 00:05:27.512 EAL: Detected lcore 1 as core 1 on socket 0 00:05:27.512 EAL: Detected lcore 2 as core 2 on socket 0 00:05:27.512 EAL: Detected lcore 3 as core 3 on socket 0 00:05:27.512 EAL: Detected lcore 4 as core 4 on socket 0 00:05:27.512 EAL: Detected lcore 5 as core 8 on socket 0 00:05:27.512 EAL: Detected lcore 6 as core 9 on socket 0 00:05:27.512 EAL: Detected lcore 7 as core 10 on socket 0 00:05:27.512 EAL: Detected lcore 8 as core 11 on socket 0 00:05:27.512 EAL: Detected lcore 9 as core 16 on socket 0 00:05:27.512 EAL: Detected lcore 10 as core 17 on socket 0 00:05:27.512 EAL: Detected lcore 11 as core 18 on socket 0 00:05:27.512 EAL: Detected lcore 12 as core 19 on socket 0 00:05:27.512 EAL: Detected lcore 13 as core 20 on socket 0 00:05:27.512 EAL: Detected lcore 14 as core 24 on socket 0 00:05:27.512 EAL: Detected lcore 15 as core 25 on socket 0 00:05:27.512 EAL: Detected lcore 16 as core 26 on socket 0 00:05:27.512 EAL: Detected lcore 17 as core 27 on socket 0 00:05:27.512 EAL: Detected lcore 18 as core 0 on socket 1 00:05:27.512 EAL: Detected lcore 19 as core 1 on socket 1 00:05:27.512 EAL: Detected lcore 20 as core 2 on socket 1 00:05:27.512 EAL: Detected lcore 21 as core 3 on socket 1 00:05:27.512 EAL: Detected lcore 22 as core 4 on socket 1 00:05:27.513 EAL: Detected lcore 23 as core 8 on socket 1 00:05:27.513 EAL: Detected lcore 24 as core 9 on socket 1 00:05:27.513 EAL: Detected lcore 25 as core 10 on socket 1 00:05:27.513 EAL: Detected lcore 26 as core 11 on socket 1 00:05:27.513 EAL: Detected lcore 27 as core 16 on socket 1 00:05:27.513 EAL: Detected lcore 28 as core 17 on socket 1 00:05:27.513 EAL: Detected lcore 29 as core 18 on socket 1 00:05:27.513 EAL: Detected lcore 30 as core 19 on socket 1 00:05:27.513 EAL: Detected lcore 31 as core 20 on socket 1 00:05:27.513 EAL: Detected lcore 32 as core 24 on socket 1 00:05:27.513 EAL: Detected lcore 33 as core 25 on socket 1 00:05:27.513 EAL: Detected lcore 34 as core 26 on socket 1 00:05:27.513 EAL: Detected lcore 35 as core 27 on socket 1 00:05:27.513 EAL: Detected lcore 36 as core 0 on socket 0 00:05:27.513 EAL: Detected lcore 37 as core 1 on socket 0 00:05:27.513 EAL: Detected lcore 38 as core 2 on socket 0 00:05:27.513 EAL: Detected lcore 39 as core 3 on socket 0 00:05:27.513 EAL: Detected lcore 40 as core 4 on socket 0 00:05:27.513 EAL: Detected lcore 41 as core 8 on socket 0 00:05:27.513 EAL: Detected lcore 42 as core 9 on socket 0 00:05:27.513 EAL: Detected lcore 43 as core 10 on socket 0 00:05:27.513 EAL: Detected lcore 44 as core 11 on socket 0 00:05:27.513 EAL: Detected lcore 45 as core 16 on socket 0 00:05:27.513 EAL: Detected lcore 46 as core 17 on socket 0 00:05:27.513 EAL: Detected lcore 47 as core 18 on socket 0 00:05:27.513 
EAL: Detected lcore 48 as core 19 on socket 0 00:05:27.513 EAL: Detected lcore 49 as core 20 on socket 0 00:05:27.513 EAL: Detected lcore 50 as core 24 on socket 0 00:05:27.513 EAL: Detected lcore 51 as core 25 on socket 0 00:05:27.513 EAL: Detected lcore 52 as core 26 on socket 0 00:05:27.513 EAL: Detected lcore 53 as core 27 on socket 0 00:05:27.513 EAL: Detected lcore 54 as core 0 on socket 1 00:05:27.513 EAL: Detected lcore 55 as core 1 on socket 1 00:05:27.513 EAL: Detected lcore 56 as core 2 on socket 1 00:05:27.513 EAL: Detected lcore 57 as core 3 on socket 1 00:05:27.513 EAL: Detected lcore 58 as core 4 on socket 1 00:05:27.513 EAL: Detected lcore 59 as core 8 on socket 1 00:05:27.513 EAL: Detected lcore 60 as core 9 on socket 1 00:05:27.513 EAL: Detected lcore 61 as core 10 on socket 1 00:05:27.513 EAL: Detected lcore 62 as core 11 on socket 1 00:05:27.513 EAL: Detected lcore 63 as core 16 on socket 1 00:05:27.513 EAL: Detected lcore 64 as core 17 on socket 1 00:05:27.513 EAL: Detected lcore 65 as core 18 on socket 1 00:05:27.513 EAL: Detected lcore 66 as core 19 on socket 1 00:05:27.513 EAL: Detected lcore 67 as core 20 on socket 1 00:05:27.513 EAL: Detected lcore 68 as core 24 on socket 1 00:05:27.513 EAL: Detected lcore 69 as core 25 on socket 1 00:05:27.513 EAL: Detected lcore 70 as core 26 on socket 1 00:05:27.513 EAL: Detected lcore 71 as core 27 on socket 1 00:05:27.513 EAL: Maximum logical cores by configuration: 128 00:05:27.513 EAL: Detected CPU lcores: 72 00:05:27.513 EAL: Detected NUMA nodes: 2 00:05:27.513 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:27.513 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:27.513 EAL: Checking presence of .so 'librte_eal.so' 00:05:27.513 EAL: Detected static linkage of DPDK 00:05:27.513 EAL: No shared files mode enabled, IPC will be disabled 00:05:27.772 EAL: Bus pci wants IOVA as 'DC' 00:05:27.772 EAL: Buses did not request a specific IOVA mode. 00:05:27.772 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:27.772 EAL: Selected IOVA mode 'VA' 00:05:27.772 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.772 EAL: Probing VFIO support... 00:05:27.772 EAL: IOMMU type 1 (Type 1) is supported 00:05:27.772 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:27.772 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:27.772 EAL: VFIO support initialized 00:05:27.772 EAL: Ask a virtual area of 0x2e000 bytes 00:05:27.772 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:27.772 EAL: Setting up physically contiguous memory... 
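
The lcore inventory above (72 logical cores over 2 sockets, with lcores 36-71 reporting the same core/socket pairs as 0-35, i.e. hyper-thread siblings) can be cross-checked against the kernel's topology view; a small sketch, assuming util-linux's lscpu is available:

    # One row per logical CPU: logical id, physical core, socket, NUMA node.
    lscpu -p=CPU,CORE,SOCKET,NODE | grep -v '^#'
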
00:05:27.772 EAL: Setting maximum number of open files to 524288 00:05:27.772 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:27.772 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:27.772 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:27.772 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:27.772 EAL: Ask a virtual area of 0x61000 bytes 00:05:27.772 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:27.772 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:27.772 EAL: Ask a virtual area of 0x400000000 bytes 00:05:27.772 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:27.772 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:27.772 EAL: Hugepages will be freed exactly as allocated. 00:05:27.772 EAL: No shared files mode enabled, IPC is disabled 00:05:27.772 EAL: No shared files mode enabled, IPC is disabled 00:05:27.772 EAL: TSC frequency is ~2300000 KHz 00:05:27.772 EAL: Main lcore 0 is ready (tid=7f02ad04ba40;cpuset=[0]) 00:05:27.772 EAL: Trying to obtain current memory policy. 00:05:27.772 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:27.772 EAL: Restoring previous memory policy: 0 00:05:27.772 EAL: request: mp_malloc_sync 00:05:27.772 EAL: No shared files mode enabled, IPC is disabled 00:05:27.772 EAL: Heap on socket 0 was expanded by 2MB 00:05:27.772 EAL: No shared files mode enabled, IPC is disabled 00:05:27.772 EAL: Mem event callback 'spdk:(nil)' registered 00:05:27.772 00:05:27.772 00:05:27.772 CUnit - A unit testing framework for C - Version 2.1-3 00:05:27.772 http://cunit.sourceforge.net/ 00:05:27.772 00:05:27.772 00:05:27.772 Suite: components_suite 00:05:28.032 Test: vtophys_malloc_test ...passed 00:05:28.032 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:28.032 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.032 EAL: Restoring previous memory policy: 4 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was expanded by 4MB 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was shrunk by 4MB 00:05:28.032 EAL: Trying to obtain current memory policy. 00:05:28.032 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.032 EAL: Restoring previous memory policy: 4 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was expanded by 6MB 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was shrunk by 6MB 00:05:28.032 EAL: Trying to obtain current memory policy. 00:05:28.032 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.032 EAL: Restoring previous memory policy: 4 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was expanded by 10MB 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was shrunk by 10MB 00:05:28.032 EAL: Trying to obtain current memory policy. 
00:05:28.032 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.032 EAL: Restoring previous memory policy: 4 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was expanded by 18MB 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was shrunk by 18MB 00:05:28.032 EAL: Trying to obtain current memory policy. 00:05:28.032 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.032 EAL: Restoring previous memory policy: 4 00:05:28.032 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.032 EAL: request: mp_malloc_sync 00:05:28.032 EAL: No shared files mode enabled, IPC is disabled 00:05:28.032 EAL: Heap on socket 0 was expanded by 34MB 00:05:28.292 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.292 EAL: request: mp_malloc_sync 00:05:28.292 EAL: No shared files mode enabled, IPC is disabled 00:05:28.292 EAL: Heap on socket 0 was shrunk by 34MB 00:05:28.292 EAL: Trying to obtain current memory policy. 00:05:28.292 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.292 EAL: Restoring previous memory policy: 4 00:05:28.292 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.292 EAL: request: mp_malloc_sync 00:05:28.292 EAL: No shared files mode enabled, IPC is disabled 00:05:28.292 EAL: Heap on socket 0 was expanded by 66MB 00:05:28.292 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.292 EAL: request: mp_malloc_sync 00:05:28.292 EAL: No shared files mode enabled, IPC is disabled 00:05:28.292 EAL: Heap on socket 0 was shrunk by 66MB 00:05:28.292 EAL: Trying to obtain current memory policy. 00:05:28.292 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.292 EAL: Restoring previous memory policy: 4 00:05:28.292 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.292 EAL: request: mp_malloc_sync 00:05:28.292 EAL: No shared files mode enabled, IPC is disabled 00:05:28.292 EAL: Heap on socket 0 was expanded by 130MB 00:05:28.551 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.551 EAL: request: mp_malloc_sync 00:05:28.551 EAL: No shared files mode enabled, IPC is disabled 00:05:28.551 EAL: Heap on socket 0 was shrunk by 130MB 00:05:28.810 EAL: Trying to obtain current memory policy. 00:05:28.810 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:28.810 EAL: Restoring previous memory policy: 4 00:05:28.810 EAL: Calling mem event callback 'spdk:(nil)' 00:05:28.810 EAL: request: mp_malloc_sync 00:05:28.810 EAL: No shared files mode enabled, IPC is disabled 00:05:28.810 EAL: Heap on socket 0 was expanded by 258MB 00:05:29.070 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.070 EAL: request: mp_malloc_sync 00:05:29.070 EAL: No shared files mode enabled, IPC is disabled 00:05:29.070 EAL: Heap on socket 0 was shrunk by 258MB 00:05:29.329 EAL: Trying to obtain current memory policy. 
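
Each "expanded by"/"shrunk by" pair in the suite above is the DPDK heap growing and then releasing hugepage-backed memory. The effect can be watched from outside while the test runs; a sketch against the standard sysfs hugepage counters (stock kernel paths, not SPDK-specific):

    # Per-node 2 MB hugepage usage; the free count dips while the heap is expanded.
    for n in /sys/devices/system/node/node*; do
        printf '%s: %s free / %s total\n' "${n##*/}" \
            "$(cat "$n/hugepages/hugepages-2048kB/free_hugepages")" \
            "$(cat "$n/hugepages/hugepages-2048kB/nr_hugepages")"
    done
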
00:05:29.329 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:29.329 EAL: Restoring previous memory policy: 4 00:05:29.329 EAL: Calling mem event callback 'spdk:(nil)' 00:05:29.329 EAL: request: mp_malloc_sync 00:05:29.329 EAL: No shared files mode enabled, IPC is disabled 00:05:29.329 EAL: Heap on socket 0 was expanded by 514MB 00:05:29.898 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.157 EAL: request: mp_malloc_sync 00:05:30.157 EAL: No shared files mode enabled, IPC is disabled 00:05:30.157 EAL: Heap on socket 0 was shrunk by 514MB 00:05:30.726 EAL: Trying to obtain current memory policy. 00:05:30.726 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:30.726 EAL: Restoring previous memory policy: 4 00:05:30.726 EAL: Calling mem event callback 'spdk:(nil)' 00:05:30.726 EAL: request: mp_malloc_sync 00:05:30.726 EAL: No shared files mode enabled, IPC is disabled 00:05:30.726 EAL: Heap on socket 0 was expanded by 1026MB 00:05:32.105 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.105 EAL: request: mp_malloc_sync 00:05:32.105 EAL: No shared files mode enabled, IPC is disabled 00:05:32.105 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:33.042 passed 00:05:33.042 00:05:33.042 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.042 suites 1 1 n/a 0 0 00:05:33.042 tests 2 2 2 0 0 00:05:33.042 asserts 497 497 497 0 n/a 00:05:33.042 00:05:33.042 Elapsed time = 5.356 seconds 00:05:33.042 EAL: Calling mem event callback 'spdk:(nil)' 00:05:33.042 EAL: request: mp_malloc_sync 00:05:33.042 EAL: No shared files mode enabled, IPC is disabled 00:05:33.042 EAL: Heap on socket 0 was shrunk by 2MB 00:05:33.042 EAL: No shared files mode enabled, IPC is disabled 00:05:33.042 EAL: No shared files mode enabled, IPC is disabled 00:05:33.042 EAL: No shared files mode enabled, IPC is disabled 00:05:33.042 00:05:33.042 real 0m5.604s 00:05:33.042 user 0m4.638s 00:05:33.042 sys 0m0.917s 00:05:33.042 11:41:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:33.042 11:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:33.042 ************************************ 00:05:33.042 END TEST env_vtophys 00:05:33.042 ************************************ 00:05:33.301 11:41:23 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:33.301 11:41:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.301 11:41:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.301 11:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:33.301 ************************************ 00:05:33.301 START TEST env_pci 00:05:33.301 ************************************ 00:05:33.301 11:41:23 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:33.301 00:05:33.301 00:05:33.301 CUnit - A unit testing framework for C - Version 2.1-3 00:05:33.301 http://cunit.sourceforge.net/ 00:05:33.301 00:05:33.301 00:05:33.301 Suite: pci 00:05:33.301 Test: pci_hook ...[2024-04-18 11:41:23.794987] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 338547 has claimed it 00:05:33.559 EAL: Cannot find device (10000:00:01.0) 00:05:33.559 EAL: Failed to attach device on primary process 00:05:33.559 passed 00:05:33.559 00:05:33.559 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.559 suites 1 1 n/a 0 0 00:05:33.559 tests 1 1 1 0 0 
00:05:33.560 asserts 25 25 25 0 n/a 00:05:33.560 00:05:33.560 Elapsed time = 0.064 seconds 00:05:33.560 00:05:33.560 real 0m0.138s 00:05:33.560 user 0m0.051s 00:05:33.560 sys 0m0.086s 00:05:33.560 11:41:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:33.560 11:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:33.560 ************************************ 00:05:33.560 END TEST env_pci 00:05:33.560 ************************************ 00:05:33.560 11:41:23 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:33.560 11:41:23 -- env/env.sh@15 -- # uname 00:05:33.560 11:41:23 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:33.560 11:41:23 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:33.560 11:41:23 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:33.560 11:41:23 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:33.560 11:41:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.560 11:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:33.560 ************************************ 00:05:33.560 START TEST env_dpdk_post_init 00:05:33.560 ************************************ 00:05:33.560 11:41:24 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:33.818 EAL: Detected CPU lcores: 72 00:05:33.818 EAL: Detected NUMA nodes: 2 00:05:33.818 EAL: Detected static linkage of DPDK 00:05:33.818 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:33.818 EAL: Selected IOVA mode 'VA' 00:05:33.818 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.818 EAL: VFIO support initialized 00:05:33.818 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:33.818 EAL: Using IOMMU type 1 (Type 1) 00:05:34.755 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0) 00:05:40.023 EAL: Releasing PCI mapped resource for 0000:1a:00.0 00:05:40.023 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000 00:05:40.281 Starting DPDK initialization... 00:05:40.281 Starting SPDK post initialization... 00:05:40.281 SPDK NVMe probe 00:05:40.281 Attaching to 0000:1a:00.0 00:05:40.281 Attached to 0000:1a:00.0 00:05:40.281 Cleaning up... 
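
For reference, the post-init test that just completed can be launched on its own with the same EAL options the harness passed above (single-core mask, fixed base virtual address); a sketch assuming the workspace path in this log and root privileges for the VFIO-bound device:

    sd=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    sudo "$sd/test/env/env_dpdk_post_init/env_dpdk_post_init" -c 0x1 --base-virtaddr=0x200000000000
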
00:05:40.281 00:05:40.281 real 0m6.612s 00:05:40.281 user 0m5.112s 00:05:40.281 sys 0m0.749s 00:05:40.281 11:41:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:40.281 11:41:30 -- common/autotest_common.sh@10 -- # set +x 00:05:40.281 ************************************ 00:05:40.281 END TEST env_dpdk_post_init 00:05:40.281 ************************************ 00:05:40.281 11:41:30 -- env/env.sh@26 -- # uname 00:05:40.281 11:41:30 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:40.281 11:41:30 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:40.281 11:41:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.281 11:41:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.281 11:41:30 -- common/autotest_common.sh@10 -- # set +x 00:05:40.541 ************************************ 00:05:40.541 START TEST env_mem_callbacks 00:05:40.541 ************************************ 00:05:40.541 11:41:30 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:40.541 EAL: Detected CPU lcores: 72 00:05:40.541 EAL: Detected NUMA nodes: 2 00:05:40.541 EAL: Detected static linkage of DPDK 00:05:40.541 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:40.541 EAL: Selected IOVA mode 'VA' 00:05:40.541 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.541 EAL: VFIO support initialized 00:05:40.541 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:40.541 00:05:40.541 00:05:40.541 CUnit - A unit testing framework for C - Version 2.1-3 00:05:40.541 http://cunit.sourceforge.net/ 00:05:40.541 00:05:40.541 00:05:40.541 Suite: memory 00:05:40.541 Test: test ... 
00:05:40.541 register 0x200000200000 2097152 00:05:40.541 malloc 3145728 00:05:40.541 register 0x200000400000 4194304 00:05:40.541 buf 0x2000004fffc0 len 3145728 PASSED 00:05:40.541 malloc 64 00:05:40.541 buf 0x2000004ffec0 len 64 PASSED 00:05:40.541 malloc 4194304 00:05:40.541 register 0x200000800000 6291456 00:05:40.541 buf 0x2000009fffc0 len 4194304 PASSED 00:05:40.541 free 0x2000004fffc0 3145728 00:05:40.541 free 0x2000004ffec0 64 00:05:40.541 unregister 0x200000400000 4194304 PASSED 00:05:40.541 free 0x2000009fffc0 4194304 00:05:40.541 unregister 0x200000800000 6291456 PASSED 00:05:40.541 malloc 8388608 00:05:40.541 register 0x200000400000 10485760 00:05:40.541 buf 0x2000005fffc0 len 8388608 PASSED 00:05:40.541 free 0x2000005fffc0 8388608 00:05:40.541 unregister 0x200000400000 10485760 PASSED 00:05:40.541 passed 00:05:40.541 00:05:40.541 Run Summary: Type Total Ran Passed Failed Inactive 00:05:40.541 suites 1 1 n/a 0 0 00:05:40.541 tests 1 1 1 0 0 00:05:40.541 asserts 15 15 15 0 n/a 00:05:40.541 00:05:40.541 Elapsed time = 0.037 seconds 00:05:40.541 00:05:40.541 real 0m0.153s 00:05:40.541 user 0m0.074s 00:05:40.541 sys 0m0.078s 00:05:40.541 11:41:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:40.541 11:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:40.541 ************************************ 00:05:40.541 END TEST env_mem_callbacks 00:05:40.541 ************************************ 00:05:40.541 00:05:40.541 real 0m13.720s 00:05:40.541 user 0m10.392s 00:05:40.541 sys 0m2.444s 00:05:40.541 11:41:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:40.541 11:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:40.541 ************************************ 00:05:40.541 END TEST env 00:05:40.541 ************************************ 00:05:40.799 11:41:31 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:40.799 11:41:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.799 11:41:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.799 11:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:40.799 ************************************ 00:05:40.799 START TEST rpc 00:05:40.799 ************************************ 00:05:40.799 11:41:31 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:41.058 * Looking for test storage... 00:05:41.058 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.058 11:41:31 -- rpc/rpc.sh@65 -- # spdk_pid=339598 00:05:41.058 11:41:31 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:41.058 11:41:31 -- rpc/rpc.sh@67 -- # waitforlisten 339598 00:05:41.058 11:41:31 -- common/autotest_common.sh@817 -- # '[' -z 339598 ']' 00:05:41.058 11:41:31 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:41.058 11:41:31 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:41.058 11:41:31 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:41.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
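
The wait announced above polls for the RPC socket rather than sleeping a fixed interval. A minimal equivalent sketch (rpc_get_methods is a standard SPDK RPC; the 100 x 0.1 s retry budget is an arbitrary illustration, not the harness's exact loop):

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    for _ in $(seq 1 100); do
        "$rpc" -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done
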
00:05:41.058 11:41:31 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:41.058 11:41:31 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:41.058 11:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:41.058 [2024-04-18 11:41:31.427234] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:05:41.058 [2024-04-18 11:41:31.427329] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid339598 ] 00:05:41.058 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.058 [2024-04-18 11:41:31.574013] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.317 [2024-04-18 11:41:31.743096] app.c: 523:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:41.317 [2024-04-18 11:41:31.743149] app.c: 527:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 339598' to capture a snapshot of events at runtime. 00:05:41.317 [2024-04-18 11:41:31.743164] app.c: 529:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:41.317 [2024-04-18 11:41:31.743175] app.c: 530:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:41.317 [2024-04-18 11:41:31.743186] app.c: 531:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid339598 for offline analysis/debug. 00:05:41.317 [2024-04-18 11:41:31.743216] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.883 11:41:32 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:41.883 11:41:32 -- common/autotest_common.sh@850 -- # return 0 00:05:41.883 11:41:32 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.883 11:41:32 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:41.883 11:41:32 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:41.883 11:41:32 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:41.883 11:41:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.883 11:41:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.883 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 ************************************ 00:05:42.141 START TEST rpc_integrity 00:05:42.141 ************************************ 00:05:42.141 11:41:32 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:42.141 11:41:32 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:42.141 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.141 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.141 11:41:32 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:42.141 11:41:32 -- rpc/rpc.sh@13 -- # jq length 00:05:42.141 11:41:32 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:42.141 
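
The rpc_integrity test traced below drives a create/inspect/delete cycle over that socket. Condensed to the bare RPCs (each appears verbatim in the trace that follows), as a sketch:

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    $rpc bdev_malloc_create 8 512                # 8 MB malloc bdev, 512 B blocks -> Malloc0
    $rpc bdev_passthru_create -b Malloc0 -p Passthru0
    $rpc bdev_get_bdevs | jq length              # 2 while both exist
    $rpc bdev_passthru_delete Passthru0
    $rpc bdev_malloc_delete Malloc0
    $rpc bdev_get_bdevs | jq length              # back to 0
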
11:41:32 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:42.141 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.141 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.141 11:41:32 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:42.141 11:41:32 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:42.141 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.141 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.141 11:41:32 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:42.141 { 00:05:42.141 "name": "Malloc0", 00:05:42.141 "aliases": [ 00:05:42.141 "7f002f5a-22bf-4c10-819c-9427d41fad31" 00:05:42.141 ], 00:05:42.141 "product_name": "Malloc disk", 00:05:42.141 "block_size": 512, 00:05:42.141 "num_blocks": 16384, 00:05:42.141 "uuid": "7f002f5a-22bf-4c10-819c-9427d41fad31", 00:05:42.141 "assigned_rate_limits": { 00:05:42.141 "rw_ios_per_sec": 0, 00:05:42.141 "rw_mbytes_per_sec": 0, 00:05:42.141 "r_mbytes_per_sec": 0, 00:05:42.141 "w_mbytes_per_sec": 0 00:05:42.141 }, 00:05:42.141 "claimed": false, 00:05:42.141 "zoned": false, 00:05:42.141 "supported_io_types": { 00:05:42.141 "read": true, 00:05:42.141 "write": true, 00:05:42.141 "unmap": true, 00:05:42.141 "write_zeroes": true, 00:05:42.141 "flush": true, 00:05:42.141 "reset": true, 00:05:42.141 "compare": false, 00:05:42.141 "compare_and_write": false, 00:05:42.141 "abort": true, 00:05:42.141 "nvme_admin": false, 00:05:42.141 "nvme_io": false 00:05:42.141 }, 00:05:42.141 "memory_domains": [ 00:05:42.141 { 00:05:42.141 "dma_device_id": "system", 00:05:42.141 "dma_device_type": 1 00:05:42.141 }, 00:05:42.141 { 00:05:42.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.141 "dma_device_type": 2 00:05:42.141 } 00:05:42.141 ], 00:05:42.141 "driver_specific": {} 00:05:42.141 } 00:05:42.141 ]' 00:05:42.141 11:41:32 -- rpc/rpc.sh@17 -- # jq length 00:05:42.141 11:41:32 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:42.141 11:41:32 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:42.141 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.141 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 [2024-04-18 11:41:32.628662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:42.141 [2024-04-18 11:41:32.628710] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:42.141 [2024-04-18 11:41:32.628745] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000026780 00:05:42.141 [2024-04-18 11:41:32.628759] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:42.141 [2024-04-18 11:41:32.630739] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:42.141 [2024-04-18 11:41:32.630768] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:42.141 Passthru0 00:05:42.141 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.141 11:41:32 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:42.141 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.141 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.141 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.141 11:41:32 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:42.141 { 00:05:42.141 "name": "Malloc0", 00:05:42.141 "aliases": [ 
00:05:42.141 "7f002f5a-22bf-4c10-819c-9427d41fad31" 00:05:42.141 ], 00:05:42.141 "product_name": "Malloc disk", 00:05:42.141 "block_size": 512, 00:05:42.141 "num_blocks": 16384, 00:05:42.141 "uuid": "7f002f5a-22bf-4c10-819c-9427d41fad31", 00:05:42.141 "assigned_rate_limits": { 00:05:42.141 "rw_ios_per_sec": 0, 00:05:42.141 "rw_mbytes_per_sec": 0, 00:05:42.141 "r_mbytes_per_sec": 0, 00:05:42.141 "w_mbytes_per_sec": 0 00:05:42.141 }, 00:05:42.141 "claimed": true, 00:05:42.141 "claim_type": "exclusive_write", 00:05:42.141 "zoned": false, 00:05:42.141 "supported_io_types": { 00:05:42.141 "read": true, 00:05:42.141 "write": true, 00:05:42.141 "unmap": true, 00:05:42.141 "write_zeroes": true, 00:05:42.141 "flush": true, 00:05:42.141 "reset": true, 00:05:42.141 "compare": false, 00:05:42.141 "compare_and_write": false, 00:05:42.141 "abort": true, 00:05:42.141 "nvme_admin": false, 00:05:42.141 "nvme_io": false 00:05:42.141 }, 00:05:42.141 "memory_domains": [ 00:05:42.141 { 00:05:42.141 "dma_device_id": "system", 00:05:42.141 "dma_device_type": 1 00:05:42.141 }, 00:05:42.141 { 00:05:42.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.141 "dma_device_type": 2 00:05:42.141 } 00:05:42.141 ], 00:05:42.141 "driver_specific": {} 00:05:42.141 }, 00:05:42.141 { 00:05:42.141 "name": "Passthru0", 00:05:42.141 "aliases": [ 00:05:42.141 "9938b9cc-ec9b-548c-88b7-968c1b80b6d8" 00:05:42.141 ], 00:05:42.141 "product_name": "passthru", 00:05:42.141 "block_size": 512, 00:05:42.141 "num_blocks": 16384, 00:05:42.141 "uuid": "9938b9cc-ec9b-548c-88b7-968c1b80b6d8", 00:05:42.141 "assigned_rate_limits": { 00:05:42.141 "rw_ios_per_sec": 0, 00:05:42.141 "rw_mbytes_per_sec": 0, 00:05:42.141 "r_mbytes_per_sec": 0, 00:05:42.141 "w_mbytes_per_sec": 0 00:05:42.141 }, 00:05:42.141 "claimed": false, 00:05:42.141 "zoned": false, 00:05:42.141 "supported_io_types": { 00:05:42.141 "read": true, 00:05:42.141 "write": true, 00:05:42.141 "unmap": true, 00:05:42.141 "write_zeroes": true, 00:05:42.141 "flush": true, 00:05:42.141 "reset": true, 00:05:42.141 "compare": false, 00:05:42.141 "compare_and_write": false, 00:05:42.141 "abort": true, 00:05:42.141 "nvme_admin": false, 00:05:42.141 "nvme_io": false 00:05:42.141 }, 00:05:42.141 "memory_domains": [ 00:05:42.141 { 00:05:42.141 "dma_device_id": "system", 00:05:42.141 "dma_device_type": 1 00:05:42.141 }, 00:05:42.141 { 00:05:42.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.141 "dma_device_type": 2 00:05:42.141 } 00:05:42.141 ], 00:05:42.141 "driver_specific": { 00:05:42.141 "passthru": { 00:05:42.141 "name": "Passthru0", 00:05:42.141 "base_bdev_name": "Malloc0" 00:05:42.141 } 00:05:42.141 } 00:05:42.141 } 00:05:42.141 ]' 00:05:42.141 11:41:32 -- rpc/rpc.sh@21 -- # jq length 00:05:42.400 11:41:32 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:42.400 11:41:32 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:42.400 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.400 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.400 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.400 11:41:32 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:42.400 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.400 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.400 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.400 11:41:32 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:42.400 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.400 11:41:32 -- 
common/autotest_common.sh@10 -- # set +x 00:05:42.400 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.400 11:41:32 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:42.400 11:41:32 -- rpc/rpc.sh@26 -- # jq length 00:05:42.400 11:41:32 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:42.400 00:05:42.400 real 0m0.304s 00:05:42.400 user 0m0.176s 00:05:42.400 sys 0m0.048s 00:05:42.400 11:41:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:42.400 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.400 ************************************ 00:05:42.400 END TEST rpc_integrity 00:05:42.400 ************************************ 00:05:42.400 11:41:32 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:42.400 11:41:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.400 11:41:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.400 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.657 ************************************ 00:05:42.657 START TEST rpc_plugins 00:05:42.657 ************************************ 00:05:42.657 11:41:32 -- common/autotest_common.sh@1111 -- # rpc_plugins 00:05:42.657 11:41:32 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:42.657 11:41:32 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.657 11:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:42.657 11:41:32 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.657 11:41:32 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:42.657 11:41:32 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:42.657 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.657 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.657 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.658 11:41:33 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:42.658 { 00:05:42.658 "name": "Malloc1", 00:05:42.658 "aliases": [ 00:05:42.658 "2de22f2d-6e12-45c3-922d-4c018ff3e583" 00:05:42.658 ], 00:05:42.658 "product_name": "Malloc disk", 00:05:42.658 "block_size": 4096, 00:05:42.658 "num_blocks": 256, 00:05:42.658 "uuid": "2de22f2d-6e12-45c3-922d-4c018ff3e583", 00:05:42.658 "assigned_rate_limits": { 00:05:42.658 "rw_ios_per_sec": 0, 00:05:42.658 "rw_mbytes_per_sec": 0, 00:05:42.658 "r_mbytes_per_sec": 0, 00:05:42.658 "w_mbytes_per_sec": 0 00:05:42.658 }, 00:05:42.658 "claimed": false, 00:05:42.658 "zoned": false, 00:05:42.658 "supported_io_types": { 00:05:42.658 "read": true, 00:05:42.658 "write": true, 00:05:42.658 "unmap": true, 00:05:42.658 "write_zeroes": true, 00:05:42.658 "flush": true, 00:05:42.658 "reset": true, 00:05:42.658 "compare": false, 00:05:42.658 "compare_and_write": false, 00:05:42.658 "abort": true, 00:05:42.658 "nvme_admin": false, 00:05:42.658 "nvme_io": false 00:05:42.658 }, 00:05:42.658 "memory_domains": [ 00:05:42.658 { 00:05:42.658 "dma_device_id": "system", 00:05:42.658 "dma_device_type": 1 00:05:42.658 }, 00:05:42.658 { 00:05:42.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:42.658 "dma_device_type": 2 00:05:42.658 } 00:05:42.658 ], 00:05:42.658 "driver_specific": {} 00:05:42.658 } 00:05:42.658 ]' 00:05:42.658 11:41:33 -- rpc/rpc.sh@32 -- # jq length 00:05:42.658 11:41:33 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:42.658 11:41:33 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:42.658 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.658 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.658 11:41:33 -- common/autotest_common.sh@577 -- 
# [[ 0 == 0 ]] 00:05:42.658 11:41:33 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:42.658 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.658 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.658 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.658 11:41:33 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:42.658 11:41:33 -- rpc/rpc.sh@36 -- # jq length 00:05:42.658 11:41:33 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:42.658 00:05:42.658 real 0m0.145s 00:05:42.658 user 0m0.090s 00:05:42.658 sys 0m0.023s 00:05:42.658 11:41:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:42.658 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.658 ************************************ 00:05:42.658 END TEST rpc_plugins 00:05:42.658 ************************************ 00:05:42.658 11:41:33 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:42.658 11:41:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.658 11:41:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.658 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.915 ************************************ 00:05:42.915 START TEST rpc_trace_cmd_test 00:05:42.915 ************************************ 00:05:42.915 11:41:33 -- common/autotest_common.sh@1111 -- # rpc_trace_cmd_test 00:05:42.915 11:41:33 -- rpc/rpc.sh@40 -- # local info 00:05:42.915 11:41:33 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:42.915 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:42.915 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:42.915 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:42.915 11:41:33 -- rpc/rpc.sh@42 -- # info='{ 00:05:42.916 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid339598", 00:05:42.916 "tpoint_group_mask": "0x8", 00:05:42.916 "iscsi_conn": { 00:05:42.916 "mask": "0x2", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "scsi": { 00:05:42.916 "mask": "0x4", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "bdev": { 00:05:42.916 "mask": "0x8", 00:05:42.916 "tpoint_mask": "0xffffffffffffffff" 00:05:42.916 }, 00:05:42.916 "nvmf_rdma": { 00:05:42.916 "mask": "0x10", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "nvmf_tcp": { 00:05:42.916 "mask": "0x20", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "ftl": { 00:05:42.916 "mask": "0x40", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "blobfs": { 00:05:42.916 "mask": "0x80", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "dsa": { 00:05:42.916 "mask": "0x200", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "thread": { 00:05:42.916 "mask": "0x400", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "nvme_pcie": { 00:05:42.916 "mask": "0x800", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "iaa": { 00:05:42.916 "mask": "0x1000", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "nvme_tcp": { 00:05:42.916 "mask": "0x2000", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "bdev_nvme": { 00:05:42.916 "mask": "0x4000", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 }, 00:05:42.916 "sock": { 00:05:42.916 "mask": "0x8000", 00:05:42.916 "tpoint_mask": "0x0" 00:05:42.916 } 00:05:42.916 }' 00:05:42.916 11:41:33 -- rpc/rpc.sh@43 -- # jq length 00:05:42.916 11:41:33 -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:42.916 11:41:33 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 
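
The jq probes above pick single fields out of the trace_get_info result. The same data can be pulled ad hoc, for example to confirm that starting spdk_tgt with '-e bdev' set the bdev group's tpoint mask (a sketch reusing only the RPC and fields already shown in this output):

    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    $rpc trace_get_info | jq -r '.bdev.tpoint_mask'   # 0xffffffffffffffff per the info above
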
00:05:42.916 11:41:33 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:42.916 11:41:33 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:42.916 11:41:33 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:42.916 11:41:33 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:43.174 11:41:33 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:43.174 11:41:33 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:43.174 11:41:33 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:43.174 00:05:43.174 real 0m0.221s 00:05:43.174 user 0m0.178s 00:05:43.174 sys 0m0.033s 00:05:43.174 11:41:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:43.174 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.174 ************************************ 00:05:43.174 END TEST rpc_trace_cmd_test 00:05:43.174 ************************************ 00:05:43.174 11:41:33 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:43.174 11:41:33 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:43.174 11:41:33 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:43.174 11:41:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:43.174 11:41:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.174 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 ************************************ 00:05:43.433 START TEST rpc_daemon_integrity 00:05:43.433 ************************************ 00:05:43.433 11:41:33 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:43.433 11:41:33 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:43.433 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.433 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.433 11:41:33 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:43.433 11:41:33 -- rpc/rpc.sh@13 -- # jq length 00:05:43.433 11:41:33 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:43.433 11:41:33 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:43.433 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.433 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.433 11:41:33 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:43.433 11:41:33 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:43.433 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.433 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.433 11:41:33 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:43.433 { 00:05:43.433 "name": "Malloc2", 00:05:43.433 "aliases": [ 00:05:43.433 "13a7f5f7-d8ae-4848-8d00-1eab8396973d" 00:05:43.433 ], 00:05:43.433 "product_name": "Malloc disk", 00:05:43.433 "block_size": 512, 00:05:43.433 "num_blocks": 16384, 00:05:43.433 "uuid": "13a7f5f7-d8ae-4848-8d00-1eab8396973d", 00:05:43.433 "assigned_rate_limits": { 00:05:43.433 "rw_ios_per_sec": 0, 00:05:43.433 "rw_mbytes_per_sec": 0, 00:05:43.433 "r_mbytes_per_sec": 0, 00:05:43.433 "w_mbytes_per_sec": 0 00:05:43.433 }, 00:05:43.433 "claimed": false, 00:05:43.433 "zoned": false, 00:05:43.433 "supported_io_types": { 00:05:43.433 "read": true, 00:05:43.433 "write": true, 00:05:43.433 "unmap": true, 00:05:43.433 "write_zeroes": true, 00:05:43.433 "flush": true, 00:05:43.433 "reset": true, 00:05:43.433 "compare": false, 00:05:43.433 "compare_and_write": false, 00:05:43.433 "abort": true, 00:05:43.433 "nvme_admin": false, 00:05:43.433 
"nvme_io": false 00:05:43.433 }, 00:05:43.433 "memory_domains": [ 00:05:43.433 { 00:05:43.433 "dma_device_id": "system", 00:05:43.433 "dma_device_type": 1 00:05:43.433 }, 00:05:43.433 { 00:05:43.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:43.433 "dma_device_type": 2 00:05:43.433 } 00:05:43.433 ], 00:05:43.433 "driver_specific": {} 00:05:43.433 } 00:05:43.433 ]' 00:05:43.433 11:41:33 -- rpc/rpc.sh@17 -- # jq length 00:05:43.433 11:41:33 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:43.433 11:41:33 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:43.433 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.433 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 [2024-04-18 11:41:33.872552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:43.433 [2024-04-18 11:41:33.872594] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:43.433 [2024-04-18 11:41:33.872625] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000027980 00:05:43.433 [2024-04-18 11:41:33.872638] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:43.433 [2024-04-18 11:41:33.874534] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:43.433 [2024-04-18 11:41:33.874561] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:43.433 Passthru0 00:05:43.433 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.433 11:41:33 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:43.433 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.433 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.433 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.433 11:41:33 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:43.433 { 00:05:43.433 "name": "Malloc2", 00:05:43.433 "aliases": [ 00:05:43.433 "13a7f5f7-d8ae-4848-8d00-1eab8396973d" 00:05:43.433 ], 00:05:43.433 "product_name": "Malloc disk", 00:05:43.433 "block_size": 512, 00:05:43.433 "num_blocks": 16384, 00:05:43.433 "uuid": "13a7f5f7-d8ae-4848-8d00-1eab8396973d", 00:05:43.433 "assigned_rate_limits": { 00:05:43.433 "rw_ios_per_sec": 0, 00:05:43.433 "rw_mbytes_per_sec": 0, 00:05:43.433 "r_mbytes_per_sec": 0, 00:05:43.433 "w_mbytes_per_sec": 0 00:05:43.433 }, 00:05:43.433 "claimed": true, 00:05:43.433 "claim_type": "exclusive_write", 00:05:43.433 "zoned": false, 00:05:43.433 "supported_io_types": { 00:05:43.433 "read": true, 00:05:43.433 "write": true, 00:05:43.433 "unmap": true, 00:05:43.433 "write_zeroes": true, 00:05:43.433 "flush": true, 00:05:43.433 "reset": true, 00:05:43.433 "compare": false, 00:05:43.433 "compare_and_write": false, 00:05:43.433 "abort": true, 00:05:43.433 "nvme_admin": false, 00:05:43.433 "nvme_io": false 00:05:43.433 }, 00:05:43.433 "memory_domains": [ 00:05:43.433 { 00:05:43.433 "dma_device_id": "system", 00:05:43.433 "dma_device_type": 1 00:05:43.433 }, 00:05:43.433 { 00:05:43.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:43.434 "dma_device_type": 2 00:05:43.434 } 00:05:43.434 ], 00:05:43.434 "driver_specific": {} 00:05:43.434 }, 00:05:43.434 { 00:05:43.434 "name": "Passthru0", 00:05:43.434 "aliases": [ 00:05:43.434 "785823a1-c3a3-5e48-a53f-bb6a7a3e9cfa" 00:05:43.434 ], 00:05:43.434 "product_name": "passthru", 00:05:43.434 "block_size": 512, 00:05:43.434 "num_blocks": 16384, 00:05:43.434 "uuid": "785823a1-c3a3-5e48-a53f-bb6a7a3e9cfa", 00:05:43.434 
"assigned_rate_limits": { 00:05:43.434 "rw_ios_per_sec": 0, 00:05:43.434 "rw_mbytes_per_sec": 0, 00:05:43.434 "r_mbytes_per_sec": 0, 00:05:43.434 "w_mbytes_per_sec": 0 00:05:43.434 }, 00:05:43.434 "claimed": false, 00:05:43.434 "zoned": false, 00:05:43.434 "supported_io_types": { 00:05:43.434 "read": true, 00:05:43.434 "write": true, 00:05:43.434 "unmap": true, 00:05:43.434 "write_zeroes": true, 00:05:43.434 "flush": true, 00:05:43.434 "reset": true, 00:05:43.434 "compare": false, 00:05:43.434 "compare_and_write": false, 00:05:43.434 "abort": true, 00:05:43.434 "nvme_admin": false, 00:05:43.434 "nvme_io": false 00:05:43.434 }, 00:05:43.434 "memory_domains": [ 00:05:43.434 { 00:05:43.434 "dma_device_id": "system", 00:05:43.434 "dma_device_type": 1 00:05:43.434 }, 00:05:43.434 { 00:05:43.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:43.434 "dma_device_type": 2 00:05:43.434 } 00:05:43.434 ], 00:05:43.434 "driver_specific": { 00:05:43.434 "passthru": { 00:05:43.434 "name": "Passthru0", 00:05:43.434 "base_bdev_name": "Malloc2" 00:05:43.434 } 00:05:43.434 } 00:05:43.434 } 00:05:43.434 ]' 00:05:43.434 11:41:33 -- rpc/rpc.sh@21 -- # jq length 00:05:43.434 11:41:33 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:43.434 11:41:33 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:43.434 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.434 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.434 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.434 11:41:33 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:43.434 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.434 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.693 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.693 11:41:33 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:43.693 11:41:33 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:43.693 11:41:33 -- common/autotest_common.sh@10 -- # set +x 00:05:43.693 11:41:33 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:43.693 11:41:33 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:43.693 11:41:33 -- rpc/rpc.sh@26 -- # jq length 00:05:43.693 11:41:34 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:43.693 00:05:43.693 real 0m0.308s 00:05:43.693 user 0m0.173s 00:05:43.693 sys 0m0.051s 00:05:43.693 11:41:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:43.693 11:41:34 -- common/autotest_common.sh@10 -- # set +x 00:05:43.693 ************************************ 00:05:43.693 END TEST rpc_daemon_integrity 00:05:43.693 ************************************ 00:05:43.693 11:41:34 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:43.693 11:41:34 -- rpc/rpc.sh@84 -- # killprocess 339598 00:05:43.693 11:41:34 -- common/autotest_common.sh@936 -- # '[' -z 339598 ']' 00:05:43.693 11:41:34 -- common/autotest_common.sh@940 -- # kill -0 339598 00:05:43.693 11:41:34 -- common/autotest_common.sh@941 -- # uname 00:05:43.693 11:41:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:43.693 11:41:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 339598 00:05:43.693 11:41:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:43.693 11:41:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:43.693 11:41:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 339598' 00:05:43.693 killing process with pid 339598 00:05:43.693 11:41:34 -- common/autotest_common.sh@955 -- # kill 339598 
00:05:43.693 11:41:34 -- common/autotest_common.sh@960 -- # wait 339598 00:05:45.596 00:05:45.596 real 0m4.392s 00:05:45.596 user 0m5.070s 00:05:45.596 sys 0m1.151s 00:05:45.596 11:41:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:45.596 11:41:35 -- common/autotest_common.sh@10 -- # set +x 00:05:45.596 ************************************ 00:05:45.596 END TEST rpc 00:05:45.596 ************************************ 00:05:45.597 11:41:35 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:45.597 11:41:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.597 11:41:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.597 11:41:35 -- common/autotest_common.sh@10 -- # set +x 00:05:45.597 ************************************ 00:05:45.597 START TEST skip_rpc 00:05:45.597 ************************************ 00:05:45.597 11:41:35 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:45.597 * Looking for test storage... 00:05:45.597 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:45.597 11:41:35 -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:45.597 11:41:35 -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:45.597 11:41:35 -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:45.597 11:41:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.597 11:41:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.597 11:41:35 -- common/autotest_common.sh@10 -- # set +x 00:05:45.597 ************************************ 00:05:45.597 START TEST skip_rpc 00:05:45.597 ************************************ 00:05:45.597 11:41:36 -- common/autotest_common.sh@1111 -- # test_skip_rpc 00:05:45.597 11:41:36 -- rpc/skip_rpc.sh@16 -- # local spdk_pid=340484 00:05:45.597 11:41:36 -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:45.597 11:41:36 -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:45.597 11:41:36 -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:45.856 [2024-04-18 11:41:36.182545] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
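The target starting here was launched with --no-rpc-server, so the spdk_get_version probe a few lines below has to fail; that negative check is the whole of skip_rpc. In outline, with the same binary and flags as above and the test's own grace period:

  #!/usr/bin/env bash
  # Sketch: no RPC server means no /var/tmp/spdk.sock, so any RPC must fail.
  set -euo pipefail
  base=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $base/build/bin/spdk_tgt --no-rpc-server -m 0x1 & pid=$!
  sleep 5                        # same grace period the test uses
  if $base/scripts/rpc.py spdk_get_version; then
    echo "unexpected: RPC server answered" >&2; kill "$pid"; exit 1
  fi
  kill "$pid" 2>/dev/null || true; wait "$pid" || true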
00:05:45.856 [2024-04-18 11:41:36.182616] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid340484 ] 00:05:45.856 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.856 [2024-04-18 11:41:36.317921] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.115 [2024-04-18 11:41:36.486972] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.386 11:41:41 -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:51.386 11:41:41 -- common/autotest_common.sh@638 -- # local es=0 00:05:51.386 11:41:41 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:51.386 11:41:41 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:05:51.386 11:41:41 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:51.386 11:41:41 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:05:51.386 11:41:41 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:51.386 11:41:41 -- common/autotest_common.sh@641 -- # rpc_cmd spdk_get_version 00:05:51.386 11:41:41 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.386 11:41:41 -- common/autotest_common.sh@10 -- # set +x 00:05:51.386 11:41:41 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:51.386 11:41:41 -- common/autotest_common.sh@641 -- # es=1 00:05:51.386 11:41:41 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:51.386 11:41:41 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:51.386 11:41:41 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:51.386 11:41:41 -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:51.386 11:41:41 -- rpc/skip_rpc.sh@23 -- # killprocess 340484 00:05:51.386 11:41:41 -- common/autotest_common.sh@936 -- # '[' -z 340484 ']' 00:05:51.386 11:41:41 -- common/autotest_common.sh@940 -- # kill -0 340484 00:05:51.386 11:41:41 -- common/autotest_common.sh@941 -- # uname 00:05:51.386 11:41:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:51.386 11:41:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 340484 00:05:51.386 11:41:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:51.386 11:41:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:51.386 11:41:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 340484' 00:05:51.386 killing process with pid 340484 00:05:51.386 11:41:41 -- common/autotest_common.sh@955 -- # kill 340484 00:05:51.386 11:41:41 -- common/autotest_common.sh@960 -- # wait 340484 00:05:52.322 00:05:52.322 real 0m6.596s 00:05:52.322 user 0m6.194s 00:05:52.322 sys 0m0.412s 00:05:52.322 11:41:42 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:52.322 11:41:42 -- common/autotest_common.sh@10 -- # set +x 00:05:52.322 ************************************ 00:05:52.322 END TEST skip_rpc 00:05:52.322 ************************************ 00:05:52.322 11:41:42 -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:52.322 11:41:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:52.322 11:41:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.322 11:41:42 -- common/autotest_common.sh@10 -- # set +x 00:05:52.581 ************************************ 00:05:52.581 START TEST skip_rpc_with_json 00:05:52.581 ************************************ 00:05:52.581 
11:41:42 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_json 00:05:52.581 11:41:42 -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:52.581 11:41:42 -- rpc/skip_rpc.sh@28 -- # local spdk_pid=341395 00:05:52.581 11:41:42 -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:52.581 11:41:42 -- rpc/skip_rpc.sh@31 -- # waitforlisten 341395 00:05:52.581 11:41:42 -- common/autotest_common.sh@817 -- # '[' -z 341395 ']' 00:05:52.581 11:41:42 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.581 11:41:42 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:52.581 11:41:42 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:52.581 11:41:42 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:52.581 11:41:42 -- common/autotest_common.sh@10 -- # set +x 00:05:52.581 11:41:42 -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:52.581 [2024-04-18 11:41:42.966419] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:05:52.581 [2024-04-18 11:41:42.966501] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid341395 ] 00:05:52.581 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.581 [2024-04-18 11:41:43.104990] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.840 [2024-04-18 11:41:43.272532] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.408 11:41:43 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:53.408 11:41:43 -- common/autotest_common.sh@850 -- # return 0 00:05:53.408 11:41:43 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:53.408 11:41:43 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:53.408 11:41:43 -- common/autotest_common.sh@10 -- # set +x 00:05:53.408 [2024-04-18 11:41:43.878549] nvmf_rpc.c:2509:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:53.408 request: 00:05:53.408 { 00:05:53.408 "trtype": "tcp", 00:05:53.408 "method": "nvmf_get_transports", 00:05:53.408 "req_id": 1 00:05:53.408 } 00:05:53.408 Got JSON-RPC error response 00:05:53.408 response: 00:05:53.408 { 00:05:53.408 "code": -19, 00:05:53.408 "message": "No such device" 00:05:53.408 } 00:05:53.408 11:41:43 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:53.408 11:41:43 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:53.408 11:41:43 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:53.408 11:41:43 -- common/autotest_common.sh@10 -- # set +x 00:05:53.408 [2024-04-18 11:41:43.886663] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:53.408 11:41:43 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:53.408 11:41:43 -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:53.408 11:41:43 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:53.408 11:41:43 -- common/autotest_common.sh@10 -- # set +x 00:05:53.669 11:41:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:53.669 11:41:44 -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:53.669 { 00:05:53.669 
"subsystems": [ 00:05:53.669 { 00:05:53.669 "subsystem": "scheduler", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "framework_set_scheduler", 00:05:53.669 "params": { 00:05:53.669 "name": "static" 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "vmd", 00:05:53.669 "config": [] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "sock", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "sock_impl_set_options", 00:05:53.669 "params": { 00:05:53.669 "impl_name": "posix", 00:05:53.669 "recv_buf_size": 2097152, 00:05:53.669 "send_buf_size": 2097152, 00:05:53.669 "enable_recv_pipe": true, 00:05:53.669 "enable_quickack": false, 00:05:53.669 "enable_placement_id": 0, 00:05:53.669 "enable_zerocopy_send_server": true, 00:05:53.669 "enable_zerocopy_send_client": false, 00:05:53.669 "zerocopy_threshold": 0, 00:05:53.669 "tls_version": 0, 00:05:53.669 "enable_ktls": false 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "sock_impl_set_options", 00:05:53.669 "params": { 00:05:53.669 "impl_name": "ssl", 00:05:53.669 "recv_buf_size": 4096, 00:05:53.669 "send_buf_size": 4096, 00:05:53.669 "enable_recv_pipe": true, 00:05:53.669 "enable_quickack": false, 00:05:53.669 "enable_placement_id": 0, 00:05:53.669 "enable_zerocopy_send_server": true, 00:05:53.669 "enable_zerocopy_send_client": false, 00:05:53.669 "zerocopy_threshold": 0, 00:05:53.669 "tls_version": 0, 00:05:53.669 "enable_ktls": false 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "iobuf", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "iobuf_set_options", 00:05:53.669 "params": { 00:05:53.669 "small_pool_count": 8192, 00:05:53.669 "large_pool_count": 1024, 00:05:53.669 "small_bufsize": 8192, 00:05:53.669 "large_bufsize": 135168 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "keyring", 00:05:53.669 "config": [] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "vfio_user_target", 00:05:53.669 "config": null 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "accel", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "accel_set_options", 00:05:53.669 "params": { 00:05:53.669 "small_cache_size": 128, 00:05:53.669 "large_cache_size": 16, 00:05:53.669 "task_count": 2048, 00:05:53.669 "sequence_count": 2048, 00:05:53.669 "buf_count": 2048 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "bdev", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "bdev_set_options", 00:05:53.669 "params": { 00:05:53.669 "bdev_io_pool_size": 65535, 00:05:53.669 "bdev_io_cache_size": 256, 00:05:53.669 "bdev_auto_examine": true, 00:05:53.669 "iobuf_small_cache_size": 128, 00:05:53.669 "iobuf_large_cache_size": 16 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "bdev_raid_set_options", 00:05:53.669 "params": { 00:05:53.669 "process_window_size_kb": 1024 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "bdev_nvme_set_options", 00:05:53.669 "params": { 00:05:53.669 "action_on_timeout": "none", 00:05:53.669 "timeout_us": 0, 00:05:53.669 "timeout_admin_us": 0, 00:05:53.669 "keep_alive_timeout_ms": 10000, 00:05:53.669 "arbitration_burst": 0, 00:05:53.669 "low_priority_weight": 0, 00:05:53.669 "medium_priority_weight": 0, 00:05:53.669 "high_priority_weight": 0, 00:05:53.669 "nvme_adminq_poll_period_us": 10000, 00:05:53.669 
"nvme_ioq_poll_period_us": 0, 00:05:53.669 "io_queue_requests": 0, 00:05:53.669 "delay_cmd_submit": true, 00:05:53.669 "transport_retry_count": 4, 00:05:53.669 "bdev_retry_count": 3, 00:05:53.669 "transport_ack_timeout": 0, 00:05:53.669 "ctrlr_loss_timeout_sec": 0, 00:05:53.669 "reconnect_delay_sec": 0, 00:05:53.669 "fast_io_fail_timeout_sec": 0, 00:05:53.669 "disable_auto_failback": false, 00:05:53.669 "generate_uuids": false, 00:05:53.669 "transport_tos": 0, 00:05:53.669 "nvme_error_stat": false, 00:05:53.669 "rdma_srq_size": 0, 00:05:53.669 "io_path_stat": false, 00:05:53.669 "allow_accel_sequence": false, 00:05:53.669 "rdma_max_cq_size": 0, 00:05:53.669 "rdma_cm_event_timeout_ms": 0, 00:05:53.669 "dhchap_digests": [ 00:05:53.669 "sha256", 00:05:53.669 "sha384", 00:05:53.669 "sha512" 00:05:53.669 ], 00:05:53.669 "dhchap_dhgroups": [ 00:05:53.669 "null", 00:05:53.669 "ffdhe2048", 00:05:53.669 "ffdhe3072", 00:05:53.669 "ffdhe4096", 00:05:53.669 "ffdhe6144", 00:05:53.669 "ffdhe8192" 00:05:53.669 ] 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "bdev_nvme_set_hotplug", 00:05:53.669 "params": { 00:05:53.669 "period_us": 100000, 00:05:53.669 "enable": false 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "bdev_iscsi_set_options", 00:05:53.669 "params": { 00:05:53.669 "timeout_sec": 30 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "bdev_wait_for_examine" 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "nvmf", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "nvmf_set_config", 00:05:53.669 "params": { 00:05:53.669 "discovery_filter": "match_any", 00:05:53.669 "admin_cmd_passthru": { 00:05:53.669 "identify_ctrlr": false 00:05:53.669 } 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "nvmf_set_max_subsystems", 00:05:53.669 "params": { 00:05:53.669 "max_subsystems": 1024 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "nvmf_set_crdt", 00:05:53.669 "params": { 00:05:53.669 "crdt1": 0, 00:05:53.669 "crdt2": 0, 00:05:53.669 "crdt3": 0 00:05:53.669 } 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "method": "nvmf_create_transport", 00:05:53.669 "params": { 00:05:53.669 "trtype": "TCP", 00:05:53.669 "max_queue_depth": 128, 00:05:53.669 "max_io_qpairs_per_ctrlr": 127, 00:05:53.669 "in_capsule_data_size": 4096, 00:05:53.669 "max_io_size": 131072, 00:05:53.669 "io_unit_size": 131072, 00:05:53.669 "max_aq_depth": 128, 00:05:53.669 "num_shared_buffers": 511, 00:05:53.669 "buf_cache_size": 4294967295, 00:05:53.669 "dif_insert_or_strip": false, 00:05:53.669 "zcopy": false, 00:05:53.669 "c2h_success": true, 00:05:53.669 "sock_priority": 0, 00:05:53.669 "abort_timeout_sec": 1, 00:05:53.669 "ack_timeout": 0 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "nbd", 00:05:53.669 "config": [] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "ublk", 00:05:53.669 "config": [] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "vhost_blk", 00:05:53.669 "config": [] 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "scsi", 00:05:53.669 "config": null 00:05:53.669 }, 00:05:53.669 { 00:05:53.669 "subsystem": "iscsi", 00:05:53.669 "config": [ 00:05:53.669 { 00:05:53.669 "method": "iscsi_set_options", 00:05:53.669 "params": { 00:05:53.669 "node_base": "iqn.2016-06.io.spdk", 00:05:53.669 "max_sessions": 128, 00:05:53.669 "max_connections_per_session": 2, 00:05:53.669 "max_queue_depth": 64, 
00:05:53.669 "default_time2wait": 2, 00:05:53.669 "default_time2retain": 20, 00:05:53.669 "first_burst_length": 8192, 00:05:53.669 "immediate_data": true, 00:05:53.669 "allow_duplicated_isid": false, 00:05:53.669 "error_recovery_level": 0, 00:05:53.669 "nop_timeout": 60, 00:05:53.669 "nop_in_interval": 30, 00:05:53.669 "disable_chap": false, 00:05:53.669 "require_chap": false, 00:05:53.669 "mutual_chap": false, 00:05:53.669 "chap_group": 0, 00:05:53.669 "max_large_datain_per_connection": 64, 00:05:53.669 "max_r2t_per_connection": 4, 00:05:53.669 "pdu_pool_size": 36864, 00:05:53.669 "immediate_data_pool_size": 16384, 00:05:53.669 "data_out_pool_size": 2048 00:05:53.669 } 00:05:53.669 } 00:05:53.669 ] 00:05:53.669 }, 00:05:53.669 { 00:05:53.670 "subsystem": "vhost_scsi", 00:05:53.670 "config": [] 00:05:53.670 } 00:05:53.670 ] 00:05:53.670 } 00:05:53.670 11:41:44 -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:53.670 11:41:44 -- rpc/skip_rpc.sh@40 -- # killprocess 341395 00:05:53.670 11:41:44 -- common/autotest_common.sh@936 -- # '[' -z 341395 ']' 00:05:53.670 11:41:44 -- common/autotest_common.sh@940 -- # kill -0 341395 00:05:53.670 11:41:44 -- common/autotest_common.sh@941 -- # uname 00:05:53.670 11:41:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:53.670 11:41:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 341395 00:05:53.670 11:41:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:53.670 11:41:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:53.670 11:41:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 341395' 00:05:53.670 killing process with pid 341395 00:05:53.670 11:41:44 -- common/autotest_common.sh@955 -- # kill 341395 00:05:53.670 11:41:44 -- common/autotest_common.sh@960 -- # wait 341395 00:05:55.576 11:41:45 -- rpc/skip_rpc.sh@47 -- # local spdk_pid=341758 00:05:55.576 11:41:45 -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:55.576 11:41:45 -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:00.854 11:41:50 -- rpc/skip_rpc.sh@50 -- # killprocess 341758 00:06:00.854 11:41:50 -- common/autotest_common.sh@936 -- # '[' -z 341758 ']' 00:06:00.854 11:41:50 -- common/autotest_common.sh@940 -- # kill -0 341758 00:06:00.854 11:41:50 -- common/autotest_common.sh@941 -- # uname 00:06:00.854 11:41:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:00.855 11:41:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 341758 00:06:00.855 11:41:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:00.855 11:41:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:00.855 11:41:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 341758' 00:06:00.855 killing process with pid 341758 00:06:00.855 11:41:50 -- common/autotest_common.sh@955 -- # kill 341758 00:06:00.855 11:41:50 -- common/autotest_common.sh@960 -- # wait 341758 00:06:01.861 11:41:52 -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:01.861 11:41:52 -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:01.861 00:06:01.861 real 0m9.305s 00:06:01.861 user 0m8.764s 00:06:01.861 sys 0m0.927s 00:06:01.861 11:41:52 -- common/autotest_common.sh@1112 -- # xtrace_disable 
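The JSON blob above is the save_config snapshot that skip_rpc_with_json round-trips: the first target creates the TCP transport, dumps its configuration to config.json, a second target is booted from that file, and the grep for 'TCP Transport Init' just above proves the transport came back. Boiled down (paths as used by this job, restart elided):

  #!/usr/bin/env bash
  # Sketch of the save/load round trip exercised above.
  set -euo pipefail
  base=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $base/scripts/rpc.py nvmf_create_transport -t tcp      # state worth persisting
  $base/scripts/rpc.py save_config > $base/test/rpc/config.json
  # relaunch: spdk_tgt --no-rpc-server -m 0x1 --json $base/test/rpc/config.json > log.txt
  grep -q 'TCP Transport Init' $base/test/rpc/log.txt    # transport restored on boot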
00:06:01.861 11:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:01.861 ************************************ 00:06:01.861 END TEST skip_rpc_with_json 00:06:01.861 ************************************ 00:06:01.861 11:41:52 -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:01.861 11:41:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:01.861 11:41:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.861 11:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:01.861 ************************************ 00:06:01.861 START TEST skip_rpc_with_delay 00:06:01.861 ************************************ 00:06:01.861 11:41:52 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_delay 00:06:01.861 11:41:52 -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:01.861 11:41:52 -- common/autotest_common.sh@638 -- # local es=0 00:06:01.861 11:41:52 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:01.861 11:41:52 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:01.861 11:41:52 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.861 11:41:52 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:01.861 11:41:52 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.861 11:41:52 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:01.861 11:41:52 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:01.861 11:41:52 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:01.861 11:41:52 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:01.861 11:41:52 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:02.121 [2024-04-18 11:41:52.445928] app.c: 751:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
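That startup error is exactly what skip_rpc_with_delay fishes for: --wait-for-rpc parks initialization until an RPC tells the target to continue, which is meaningless when no RPC server will ever start, so spdk_tgt has to refuse the combination. Reduced to its core:

  #!/usr/bin/env bash
  # Sketch: this flag combination must be rejected at startup.
  tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  if $tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
    echo "unexpected: spdk_tgt came up" >&2; exit 1
  fi
  echo "rejected at startup, as expected"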
00:06:02.121 [2024-04-18 11:41:52.446021] app.c: 630:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:02.121 11:41:52 -- common/autotest_common.sh@641 -- # es=1 00:06:02.121 11:41:52 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:02.121 11:41:52 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:02.121 11:41:52 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:02.121 00:06:02.121 real 0m0.084s 00:06:02.121 user 0m0.033s 00:06:02.121 sys 0m0.050s 00:06:02.121 11:41:52 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:02.121 11:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:02.121 ************************************ 00:06:02.121 END TEST skip_rpc_with_delay 00:06:02.121 ************************************ 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@77 -- # uname 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:02.121 11:41:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:02.121 11:41:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.121 11:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:02.121 ************************************ 00:06:02.121 START TEST exit_on_failed_rpc_init 00:06:02.121 ************************************ 00:06:02.121 11:41:52 -- common/autotest_common.sh@1111 -- # test_exit_on_failed_rpc_init 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@62 -- # local spdk_pid=342710 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@63 -- # waitforlisten 342710 00:06:02.121 11:41:52 -- common/autotest_common.sh@817 -- # '[' -z 342710 ']' 00:06:02.121 11:41:52 -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:02.121 11:41:52 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.121 11:41:52 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:02.121 11:41:52 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:02.121 11:41:52 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:02.121 11:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:02.380 [2024-04-18 11:41:52.688159] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
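With the first target about to claim /var/tmp/spdk.sock, exit_on_failed_rpc_init launches a second spdk_tgt below on -m 0x2 against the same default socket; the 'socket in use' errors further down are the expected result, and the second instance must exit non-zero. The shape of the check, as a sketch:

  #!/usr/bin/env bash
  # Sketch: two targets, one RPC socket; the second must fail to initialize.
  set -euo pipefail
  tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  $tgt -m 0x1 & pid=$!
  sleep 2                         # crude stand-in for the harness's waitforlisten
  if $tgt -m 0x2; then            # same default /var/tmp/spdk.sock
    echo "unexpected: second target initialized" >&2; kill "$pid"; exit 1
  fi
  kill "$pid" 2>/dev/null || true; wait "$pid" || true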
00:06:02.380 [2024-04-18 11:41:52.688237] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid342710 ] 00:06:02.380 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.380 [2024-04-18 11:41:52.831201] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.640 [2024-04-18 11:41:53.005040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.208 11:41:53 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:03.208 11:41:53 -- common/autotest_common.sh@850 -- # return 0 00:06:03.208 11:41:53 -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:03.208 11:41:53 -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:03.208 11:41:53 -- common/autotest_common.sh@638 -- # local es=0 00:06:03.208 11:41:53 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:03.208 11:41:53 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.208 11:41:53 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:03.208 11:41:53 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.208 11:41:53 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:03.208 11:41:53 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.208 11:41:53 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:03.208 11:41:53 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:03.208 11:41:53 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:03.208 11:41:53 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:03.208 [2024-04-18 11:41:53.646439] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:03.208 [2024-04-18 11:41:53.646528] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid342887 ] 00:06:03.208 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.467 [2024-04-18 11:41:53.787271] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.468 [2024-04-18 11:41:53.961981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.468 [2024-04-18 11:41:53.962095] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:03.468 [2024-04-18 11:41:53.962115] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:03.468 [2024-04-18 11:41:53.962126] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:03.727 11:41:54 -- common/autotest_common.sh@641 -- # es=234 00:06:03.727 11:41:54 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:03.727 11:41:54 -- common/autotest_common.sh@650 -- # es=106 00:06:03.727 11:41:54 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:03.727 11:41:54 -- common/autotest_common.sh@658 -- # es=1 00:06:03.727 11:41:54 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:03.727 11:41:54 -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:03.727 11:41:54 -- rpc/skip_rpc.sh@70 -- # killprocess 342710 00:06:03.727 11:41:54 -- common/autotest_common.sh@936 -- # '[' -z 342710 ']' 00:06:03.727 11:41:54 -- common/autotest_common.sh@940 -- # kill -0 342710 00:06:03.727 11:41:54 -- common/autotest_common.sh@941 -- # uname 00:06:03.727 11:41:54 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:03.727 11:41:54 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 342710 00:06:03.987 11:41:54 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:03.987 11:41:54 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:03.987 11:41:54 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 342710' 00:06:03.987 killing process with pid 342710 00:06:03.987 11:41:54 -- common/autotest_common.sh@955 -- # kill 342710 00:06:03.987 11:41:54 -- common/autotest_common.sh@960 -- # wait 342710 00:06:05.367 00:06:05.367 real 0m3.167s 00:06:05.367 user 0m3.469s 00:06:05.367 sys 0m0.683s 00:06:05.367 11:41:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:05.367 11:41:55 -- common/autotest_common.sh@10 -- # set +x 00:06:05.367 ************************************ 00:06:05.367 END TEST exit_on_failed_rpc_init 00:06:05.367 ************************************ 00:06:05.367 11:41:55 -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:05.367 00:06:05.367 real 0m19.982s 00:06:05.367 user 0m18.748s 00:06:05.367 sys 0m2.559s 00:06:05.367 11:41:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:05.367 11:41:55 -- common/autotest_common.sh@10 -- # set +x 00:06:05.367 ************************************ 00:06:05.367 END TEST skip_rpc 00:06:05.367 ************************************ 00:06:05.367 11:41:55 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:05.367 11:41:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:05.367 11:41:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.367 11:41:55 -- common/autotest_common.sh@10 -- # set +x 00:06:05.627 ************************************ 00:06:05.627 START TEST rpc_client 00:06:05.627 ************************************ 00:06:05.627 11:41:56 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:05.627 * Looking for test storage... 
00:06:05.627 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:05.627 11:41:56 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:05.886 OK 00:06:05.886 11:41:56 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:05.886 00:06:05.886 real 0m0.161s 00:06:05.886 user 0m0.065s 00:06:05.886 sys 0m0.107s 00:06:05.886 11:41:56 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:05.886 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:06:05.886 ************************************ 00:06:05.886 END TEST rpc_client 00:06:05.886 ************************************ 00:06:05.886 11:41:56 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:05.886 11:41:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:05.886 11:41:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.886 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:06:05.886 ************************************ 00:06:05.886 START TEST json_config 00:06:05.886 ************************************ 00:06:05.886 11:41:56 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:06.145 11:41:56 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:06.145 11:41:56 -- nvmf/common.sh@7 -- # uname -s 00:06:06.145 11:41:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:06.145 11:41:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:06.145 11:41:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:06.145 11:41:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:06.145 11:41:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:06.145 11:41:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:06.145 11:41:56 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:06.145 11:41:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:06.145 11:41:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:06.145 11:41:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:06.145 11:41:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:06:06.145 11:41:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:06:06.145 11:41:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:06.145 11:41:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:06.145 11:41:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:06.145 11:41:56 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:06.145 11:41:56 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:06.145 11:41:56 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:06.145 11:41:56 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:06.145 11:41:56 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:06.145 11:41:56 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.145 11:41:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.145 11:41:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.145 11:41:56 -- paths/export.sh@5 -- # export PATH 00:06:06.145 11:41:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.145 11:41:56 -- nvmf/common.sh@47 -- # : 0 00:06:06.145 11:41:56 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:06.145 11:41:56 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:06.145 11:41:56 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:06.145 11:41:56 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:06.145 11:41:56 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:06.145 11:41:56 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:06.145 11:41:56 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:06.145 11:41:56 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:06.145 11:41:56 -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:06.145 11:41:56 -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:06.145 11:41:56 -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:06.145 11:41:56 -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:06.145 11:41:56 -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:06.145 11:41:56 -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:06.145 WARNING: No tests are enabled so not running JSON configuration tests 00:06:06.145 11:41:56 -- json_config/json_config.sh@28 -- # exit 0 00:06:06.145 00:06:06.145 real 0m0.115s 00:06:06.145 user 0m0.055s 00:06:06.146 sys 0m0.061s 00:06:06.146 11:41:56 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:06.146 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:06:06.146 ************************************ 00:06:06.146 END TEST 
json_config 00:06:06.146 ************************************ 00:06:06.146 11:41:56 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:06.146 11:41:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:06.146 11:41:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:06.146 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:06:06.406 ************************************ 00:06:06.406 START TEST json_config_extra_key 00:06:06.406 ************************************ 00:06:06.406 11:41:56 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:06.406 11:41:56 -- nvmf/common.sh@7 -- # uname -s 00:06:06.406 11:41:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:06.406 11:41:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:06.406 11:41:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:06.406 11:41:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:06.406 11:41:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:06.406 11:41:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:06.406 11:41:56 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:06.406 11:41:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:06.406 11:41:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:06.406 11:41:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:06.406 11:41:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:06:06.406 11:41:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:06:06.406 11:41:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:06.406 11:41:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:06.406 11:41:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:06.406 11:41:56 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:06.406 11:41:56 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:06.406 11:41:56 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:06.406 11:41:56 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:06.406 11:41:56 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:06.406 11:41:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.406 11:41:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.406 11:41:56 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.406 11:41:56 -- paths/export.sh@5 -- # export PATH 00:06:06.406 11:41:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:06.406 11:41:56 -- nvmf/common.sh@47 -- # : 0 00:06:06.406 11:41:56 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:06.406 11:41:56 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:06.406 11:41:56 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:06.406 11:41:56 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:06.406 11:41:56 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:06.406 11:41:56 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:06.406 11:41:56 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:06.406 11:41:56 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:06.406 INFO: launching applications... 
00:06:06.406 11:41:56 -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:06.406 11:41:56 -- json_config/common.sh@9 -- # local app=target 00:06:06.406 11:41:56 -- json_config/common.sh@10 -- # shift 00:06:06.406 11:41:56 -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:06.406 11:41:56 -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:06.406 11:41:56 -- json_config/common.sh@15 -- # local app_extra_params= 00:06:06.406 11:41:56 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:06.406 11:41:56 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:06.406 11:41:56 -- json_config/common.sh@22 -- # app_pid["$app"]=343412 00:06:06.406 11:41:56 -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:06.406 Waiting for target to run... 00:06:06.406 11:41:56 -- json_config/common.sh@25 -- # waitforlisten 343412 /var/tmp/spdk_tgt.sock 00:06:06.406 11:41:56 -- common/autotest_common.sh@817 -- # '[' -z 343412 ']' 00:06:06.406 11:41:56 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:06.406 11:41:56 -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:06.406 11:41:56 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:06.406 11:41:56 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:06.406 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:06.406 11:41:56 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:06.406 11:41:56 -- common/autotest_common.sh@10 -- # set +x 00:06:06.407 [2024-04-18 11:41:56.875615] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:06.407 [2024-04-18 11:41:56.875712] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid343412 ] 00:06:06.666 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.925 [2024-04-18 11:41:57.245223] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.925 [2024-04-18 11:41:57.396001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.494 11:41:57 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:07.494 11:41:57 -- common/autotest_common.sh@850 -- # return 0 00:06:07.494 11:41:57 -- json_config/common.sh@26 -- # echo '' 00:06:07.494 00:06:07.494 11:41:57 -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:07.494 INFO: shutting down applications... 
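The kill -0 / sleep 0.5 cycle running here is json_config/common.sh's shutdown wait: after SIGINT it polls up to 30 times for the target pid to disappear before declaring 'SPDK target shutdown done'. Roughly:

  #!/usr/bin/env bash
  # Sketch of the shutdown poll visible around this point (343412 is this run's target pid).
  pid=343412
  kill -SIGINT "$pid"
  for ((i = 0; i < 30; i++)); do
    kill -0 "$pid" 2>/dev/null || { echo 'SPDK target shutdown done'; exit 0; }
    sleep 0.5
  done
  echo 'target did not shut down' >&2; exit 1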
00:06:07.494 11:41:57 -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:07.494 11:41:57 -- json_config/common.sh@31 -- # local app=target 00:06:07.494 11:41:57 -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:07.494 11:41:57 -- json_config/common.sh@35 -- # [[ -n 343412 ]] 00:06:07.494 11:41:57 -- json_config/common.sh@38 -- # kill -SIGINT 343412 00:06:07.494 11:41:57 -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:07.494 11:41:57 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:07.494 11:41:57 -- json_config/common.sh@41 -- # kill -0 343412 00:06:07.494 11:41:57 -- json_config/common.sh@45 -- # sleep 0.5 00:06:08.062 11:41:58 -- json_config/common.sh@40 -- # (( i++ )) 00:06:08.062 11:41:58 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:08.062 11:41:58 -- json_config/common.sh@41 -- # kill -0 343412 00:06:08.062 11:41:58 -- json_config/common.sh@45 -- # sleep 0.5 00:06:08.321 11:41:58 -- json_config/common.sh@40 -- # (( i++ )) 00:06:08.322 11:41:58 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:08.322 11:41:58 -- json_config/common.sh@41 -- # kill -0 343412 00:06:08.322 11:41:58 -- json_config/common.sh@45 -- # sleep 0.5 00:06:08.897 11:41:59 -- json_config/common.sh@40 -- # (( i++ )) 00:06:08.897 11:41:59 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:08.897 11:41:59 -- json_config/common.sh@41 -- # kill -0 343412 00:06:08.897 11:41:59 -- json_config/common.sh@45 -- # sleep 0.5 00:06:09.466 11:41:59 -- json_config/common.sh@40 -- # (( i++ )) 00:06:09.466 11:41:59 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:09.466 11:41:59 -- json_config/common.sh@41 -- # kill -0 343412 00:06:09.466 11:41:59 -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:09.466 11:41:59 -- json_config/common.sh@43 -- # break 00:06:09.466 11:41:59 -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:09.466 11:41:59 -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:09.466 SPDK target shutdown done 00:06:09.466 11:41:59 -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:09.466 Success 00:06:09.466 00:06:09.466 real 0m3.149s 00:06:09.466 user 0m2.679s 00:06:09.466 sys 0m0.561s 00:06:09.466 11:41:59 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:09.466 11:41:59 -- common/autotest_common.sh@10 -- # set +x 00:06:09.466 ************************************ 00:06:09.466 END TEST json_config_extra_key 00:06:09.466 ************************************ 00:06:09.466 11:41:59 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:09.466 11:41:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:09.466 11:41:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:09.466 11:41:59 -- common/autotest_common.sh@10 -- # set +x 00:06:09.725 ************************************ 00:06:09.725 START TEST alias_rpc 00:06:09.725 ************************************ 00:06:09.725 11:42:00 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:09.725 * Looking for test storage... 
00:06:09.725 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:09.725 11:42:00 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:09.725 11:42:00 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=344006 00:06:09.725 11:42:00 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.725 11:42:00 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 344006 00:06:09.725 11:42:00 -- common/autotest_common.sh@817 -- # '[' -z 344006 ']' 00:06:09.725 11:42:00 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.725 11:42:00 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:09.725 11:42:00 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:09.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:09.725 11:42:00 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:09.725 11:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:09.725 [2024-04-18 11:42:00.208601] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:09.725 [2024-04-18 11:42:00.208691] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid344006 ] 00:06:09.985 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.985 [2024-04-18 11:42:00.351575] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.985 [2024-04-18 11:42:00.528383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.924 11:42:01 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:10.924 11:42:01 -- common/autotest_common.sh@850 -- # return 0 00:06:10.924 11:42:01 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:10.924 11:42:01 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 344006 00:06:10.924 11:42:01 -- common/autotest_common.sh@936 -- # '[' -z 344006 ']' 00:06:10.924 11:42:01 -- common/autotest_common.sh@940 -- # kill -0 344006 00:06:10.924 11:42:01 -- common/autotest_common.sh@941 -- # uname 00:06:10.924 11:42:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:10.924 11:42:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 344006 00:06:10.924 11:42:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:10.924 11:42:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:10.924 11:42:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 344006' 00:06:10.924 killing process with pid 344006 00:06:10.924 11:42:01 -- common/autotest_common.sh@955 -- # kill 344006 00:06:10.924 11:42:01 -- common/autotest_common.sh@960 -- # wait 344006 00:06:12.829 00:06:12.829 real 0m2.859s 00:06:12.829 user 0m2.810s 00:06:12.829 sys 0m0.591s 00:06:12.829 11:42:02 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:12.829 11:42:02 -- common/autotest_common.sh@10 -- # set +x 00:06:12.829 ************************************ 00:06:12.829 END TEST alias_rpc 00:06:12.829 ************************************ 00:06:12.829 11:42:02 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:06:12.829 11:42:02 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:12.829 11:42:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.829 11:42:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.829 11:42:02 -- common/autotest_common.sh@10 -- # set +x 00:06:12.829 ************************************ 00:06:12.829 START TEST spdkcli_tcp 00:06:12.829 ************************************ 00:06:12.829 11:42:03 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:12.829 * Looking for test storage... 00:06:12.829 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:12.829 11:42:03 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:12.829 11:42:03 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:12.829 11:42:03 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:12.829 11:42:03 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:12.829 11:42:03 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:12.829 11:42:03 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:12.829 11:42:03 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:12.829 11:42:03 -- common/autotest_common.sh@710 -- # xtrace_disable 00:06:12.830 11:42:03 -- common/autotest_common.sh@10 -- # set +x 00:06:12.830 11:42:03 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=344534 00:06:12.830 11:42:03 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:12.830 11:42:03 -- spdkcli/tcp.sh@27 -- # waitforlisten 344534 00:06:12.830 11:42:03 -- common/autotest_common.sh@817 -- # '[' -z 344534 ']' 00:06:12.830 11:42:03 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:12.830 11:42:03 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:12.830 11:42:03 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:12.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:12.830 11:42:03 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:12.830 11:42:03 -- common/autotest_common.sh@10 -- # set +x 00:06:12.830 [2024-04-18 11:42:03.255076] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:06:12.830 [2024-04-18 11:42:03.255173] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid344534 ] 00:06:12.830 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.089 [2024-04-18 11:42:03.400205] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:13.089 [2024-04-18 11:42:03.578625] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.089 [2024-04-18 11:42:03.578636] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:13.658 11:42:04 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:13.658 11:42:04 -- common/autotest_common.sh@850 -- # return 0 00:06:13.658 11:42:04 -- spdkcli/tcp.sh@31 -- # socat_pid=344708 00:06:13.658 11:42:04 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:13.658 11:42:04 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:13.919 [ 00:06:13.919 "spdk_get_version", 00:06:13.919 "rpc_get_methods", 00:06:13.919 "trace_get_info", 00:06:13.919 "trace_get_tpoint_group_mask", 00:06:13.919 "trace_disable_tpoint_group", 00:06:13.919 "trace_enable_tpoint_group", 00:06:13.919 "trace_clear_tpoint_mask", 00:06:13.919 "trace_set_tpoint_mask", 00:06:13.919 "vfu_tgt_set_base_path", 00:06:13.919 "framework_get_pci_devices", 00:06:13.919 "framework_get_config", 00:06:13.919 "framework_get_subsystems", 00:06:13.919 "keyring_get_keys", 00:06:13.919 "iobuf_get_stats", 00:06:13.919 "iobuf_set_options", 00:06:13.919 "sock_set_default_impl", 00:06:13.919 "sock_impl_set_options", 00:06:13.919 "sock_impl_get_options", 00:06:13.919 "vmd_rescan", 00:06:13.919 "vmd_remove_device", 00:06:13.919 "vmd_enable", 00:06:13.919 "accel_get_stats", 00:06:13.919 "accel_set_options", 00:06:13.919 "accel_set_driver", 00:06:13.919 "accel_crypto_key_destroy", 00:06:13.919 "accel_crypto_keys_get", 00:06:13.919 "accel_crypto_key_create", 00:06:13.919 "accel_assign_opc", 00:06:13.919 "accel_get_module_info", 00:06:13.919 "accel_get_opc_assignments", 00:06:13.919 "notify_get_notifications", 00:06:13.919 "notify_get_types", 00:06:13.919 "bdev_get_histogram", 00:06:13.919 "bdev_enable_histogram", 00:06:13.919 "bdev_set_qos_limit", 00:06:13.919 "bdev_set_qd_sampling_period", 00:06:13.919 "bdev_get_bdevs", 00:06:13.919 "bdev_reset_iostat", 00:06:13.919 "bdev_get_iostat", 00:06:13.919 "bdev_examine", 00:06:13.919 "bdev_wait_for_examine", 00:06:13.919 "bdev_set_options", 00:06:13.919 "scsi_get_devices", 00:06:13.919 "thread_set_cpumask", 00:06:13.919 "framework_get_scheduler", 00:06:13.919 "framework_set_scheduler", 00:06:13.919 "framework_get_reactors", 00:06:13.919 "thread_get_io_channels", 00:06:13.919 "thread_get_pollers", 00:06:13.919 "thread_get_stats", 00:06:13.919 "framework_monitor_context_switch", 00:06:13.919 "spdk_kill_instance", 00:06:13.919 "log_enable_timestamps", 00:06:13.919 "log_get_flags", 00:06:13.919 "log_clear_flag", 00:06:13.919 "log_set_flag", 00:06:13.919 "log_get_level", 00:06:13.919 "log_set_level", 00:06:13.919 "log_get_print_level", 00:06:13.919 "log_set_print_level", 00:06:13.919 "framework_enable_cpumask_locks", 00:06:13.919 "framework_disable_cpumask_locks", 00:06:13.919 "framework_wait_init", 00:06:13.919 "framework_start_init", 00:06:13.919 "virtio_blk_create_transport", 00:06:13.919 "virtio_blk_get_transports", 
00:06:13.919 "vhost_controller_set_coalescing", 00:06:13.919 "vhost_get_controllers", 00:06:13.919 "vhost_delete_controller", 00:06:13.919 "vhost_create_blk_controller", 00:06:13.919 "vhost_scsi_controller_remove_target", 00:06:13.919 "vhost_scsi_controller_add_target", 00:06:13.919 "vhost_start_scsi_controller", 00:06:13.919 "vhost_create_scsi_controller", 00:06:13.919 "ublk_recover_disk", 00:06:13.919 "ublk_get_disks", 00:06:13.919 "ublk_stop_disk", 00:06:13.919 "ublk_start_disk", 00:06:13.919 "ublk_destroy_target", 00:06:13.919 "ublk_create_target", 00:06:13.919 "nbd_get_disks", 00:06:13.919 "nbd_stop_disk", 00:06:13.919 "nbd_start_disk", 00:06:13.919 "env_dpdk_get_mem_stats", 00:06:13.919 "nvmf_subsystem_get_listeners", 00:06:13.919 "nvmf_subsystem_get_qpairs", 00:06:13.919 "nvmf_subsystem_get_controllers", 00:06:13.919 "nvmf_get_stats", 00:06:13.919 "nvmf_get_transports", 00:06:13.919 "nvmf_create_transport", 00:06:13.919 "nvmf_get_targets", 00:06:13.919 "nvmf_delete_target", 00:06:13.919 "nvmf_create_target", 00:06:13.919 "nvmf_subsystem_allow_any_host", 00:06:13.919 "nvmf_subsystem_remove_host", 00:06:13.919 "nvmf_subsystem_add_host", 00:06:13.919 "nvmf_ns_remove_host", 00:06:13.919 "nvmf_ns_add_host", 00:06:13.919 "nvmf_subsystem_remove_ns", 00:06:13.919 "nvmf_subsystem_add_ns", 00:06:13.919 "nvmf_subsystem_listener_set_ana_state", 00:06:13.919 "nvmf_discovery_get_referrals", 00:06:13.919 "nvmf_discovery_remove_referral", 00:06:13.919 "nvmf_discovery_add_referral", 00:06:13.919 "nvmf_subsystem_remove_listener", 00:06:13.919 "nvmf_subsystem_add_listener", 00:06:13.919 "nvmf_delete_subsystem", 00:06:13.919 "nvmf_create_subsystem", 00:06:13.919 "nvmf_get_subsystems", 00:06:13.919 "nvmf_set_crdt", 00:06:13.919 "nvmf_set_config", 00:06:13.919 "nvmf_set_max_subsystems", 00:06:13.919 "iscsi_set_options", 00:06:13.919 "iscsi_get_auth_groups", 00:06:13.919 "iscsi_auth_group_remove_secret", 00:06:13.919 "iscsi_auth_group_add_secret", 00:06:13.919 "iscsi_delete_auth_group", 00:06:13.919 "iscsi_create_auth_group", 00:06:13.919 "iscsi_set_discovery_auth", 00:06:13.919 "iscsi_get_options", 00:06:13.919 "iscsi_target_node_request_logout", 00:06:13.919 "iscsi_target_node_set_redirect", 00:06:13.919 "iscsi_target_node_set_auth", 00:06:13.919 "iscsi_target_node_add_lun", 00:06:13.919 "iscsi_get_stats", 00:06:13.919 "iscsi_get_connections", 00:06:13.919 "iscsi_portal_group_set_auth", 00:06:13.919 "iscsi_start_portal_group", 00:06:13.919 "iscsi_delete_portal_group", 00:06:13.919 "iscsi_create_portal_group", 00:06:13.919 "iscsi_get_portal_groups", 00:06:13.919 "iscsi_delete_target_node", 00:06:13.919 "iscsi_target_node_remove_pg_ig_maps", 00:06:13.919 "iscsi_target_node_add_pg_ig_maps", 00:06:13.919 "iscsi_create_target_node", 00:06:13.919 "iscsi_get_target_nodes", 00:06:13.919 "iscsi_delete_initiator_group", 00:06:13.919 "iscsi_initiator_group_remove_initiators", 00:06:13.919 "iscsi_initiator_group_add_initiators", 00:06:13.919 "iscsi_create_initiator_group", 00:06:13.919 "iscsi_get_initiator_groups", 00:06:13.919 "keyring_file_remove_key", 00:06:13.919 "keyring_file_add_key", 00:06:13.919 "vfu_virtio_create_scsi_endpoint", 00:06:13.919 "vfu_virtio_scsi_remove_target", 00:06:13.919 "vfu_virtio_scsi_add_target", 00:06:13.919 "vfu_virtio_create_blk_endpoint", 00:06:13.919 "vfu_virtio_delete_endpoint", 00:06:13.919 "iaa_scan_accel_module", 00:06:13.919 "dsa_scan_accel_module", 00:06:13.919 "ioat_scan_accel_module", 00:06:13.920 "accel_error_inject_error", 00:06:13.920 "bdev_iscsi_delete", 00:06:13.920 
"bdev_iscsi_create", 00:06:13.920 "bdev_iscsi_set_options", 00:06:13.920 "bdev_virtio_attach_controller", 00:06:13.920 "bdev_virtio_scsi_get_devices", 00:06:13.920 "bdev_virtio_detach_controller", 00:06:13.920 "bdev_virtio_blk_set_hotplug", 00:06:13.920 "bdev_ftl_set_property", 00:06:13.920 "bdev_ftl_get_properties", 00:06:13.920 "bdev_ftl_get_stats", 00:06:13.920 "bdev_ftl_unmap", 00:06:13.920 "bdev_ftl_unload", 00:06:13.920 "bdev_ftl_delete", 00:06:13.920 "bdev_ftl_load", 00:06:13.920 "bdev_ftl_create", 00:06:13.920 "bdev_aio_delete", 00:06:13.920 "bdev_aio_rescan", 00:06:13.920 "bdev_aio_create", 00:06:13.920 "blobfs_create", 00:06:13.920 "blobfs_detect", 00:06:13.920 "blobfs_set_cache_size", 00:06:13.920 "bdev_zone_block_delete", 00:06:13.920 "bdev_zone_block_create", 00:06:13.920 "bdev_delay_delete", 00:06:13.920 "bdev_delay_create", 00:06:13.920 "bdev_delay_update_latency", 00:06:13.920 "bdev_split_delete", 00:06:13.920 "bdev_split_create", 00:06:13.920 "bdev_error_inject_error", 00:06:13.920 "bdev_error_delete", 00:06:13.920 "bdev_error_create", 00:06:13.920 "bdev_raid_set_options", 00:06:13.920 "bdev_raid_remove_base_bdev", 00:06:13.920 "bdev_raid_add_base_bdev", 00:06:13.920 "bdev_raid_delete", 00:06:13.920 "bdev_raid_create", 00:06:13.920 "bdev_raid_get_bdevs", 00:06:13.920 "bdev_lvol_grow_lvstore", 00:06:13.920 "bdev_lvol_get_lvols", 00:06:13.920 "bdev_lvol_get_lvstores", 00:06:13.920 "bdev_lvol_delete", 00:06:13.920 "bdev_lvol_set_read_only", 00:06:13.920 "bdev_lvol_resize", 00:06:13.920 "bdev_lvol_decouple_parent", 00:06:13.920 "bdev_lvol_inflate", 00:06:13.920 "bdev_lvol_rename", 00:06:13.920 "bdev_lvol_clone_bdev", 00:06:13.920 "bdev_lvol_clone", 00:06:13.920 "bdev_lvol_snapshot", 00:06:13.920 "bdev_lvol_create", 00:06:13.920 "bdev_lvol_delete_lvstore", 00:06:13.920 "bdev_lvol_rename_lvstore", 00:06:13.920 "bdev_lvol_create_lvstore", 00:06:13.920 "bdev_passthru_delete", 00:06:13.920 "bdev_passthru_create", 00:06:13.920 "bdev_nvme_cuse_unregister", 00:06:13.920 "bdev_nvme_cuse_register", 00:06:13.920 "bdev_opal_new_user", 00:06:13.920 "bdev_opal_set_lock_state", 00:06:13.920 "bdev_opal_delete", 00:06:13.920 "bdev_opal_get_info", 00:06:13.920 "bdev_opal_create", 00:06:13.920 "bdev_nvme_opal_revert", 00:06:13.920 "bdev_nvme_opal_init", 00:06:13.920 "bdev_nvme_send_cmd", 00:06:13.920 "bdev_nvme_get_path_iostat", 00:06:13.920 "bdev_nvme_get_mdns_discovery_info", 00:06:13.920 "bdev_nvme_stop_mdns_discovery", 00:06:13.920 "bdev_nvme_start_mdns_discovery", 00:06:13.920 "bdev_nvme_set_multipath_policy", 00:06:13.920 "bdev_nvme_set_preferred_path", 00:06:13.920 "bdev_nvme_get_io_paths", 00:06:13.920 "bdev_nvme_remove_error_injection", 00:06:13.920 "bdev_nvme_add_error_injection", 00:06:13.920 "bdev_nvme_get_discovery_info", 00:06:13.920 "bdev_nvme_stop_discovery", 00:06:13.920 "bdev_nvme_start_discovery", 00:06:13.920 "bdev_nvme_get_controller_health_info", 00:06:13.920 "bdev_nvme_disable_controller", 00:06:13.920 "bdev_nvme_enable_controller", 00:06:13.920 "bdev_nvme_reset_controller", 00:06:13.920 "bdev_nvme_get_transport_statistics", 00:06:13.920 "bdev_nvme_apply_firmware", 00:06:13.920 "bdev_nvme_detach_controller", 00:06:13.920 "bdev_nvme_get_controllers", 00:06:13.920 "bdev_nvme_attach_controller", 00:06:13.920 "bdev_nvme_set_hotplug", 00:06:13.920 "bdev_nvme_set_options", 00:06:13.920 "bdev_null_resize", 00:06:13.920 "bdev_null_delete", 00:06:13.920 "bdev_null_create", 00:06:13.920 "bdev_malloc_delete", 00:06:13.920 "bdev_malloc_create" 00:06:13.920 ] 00:06:13.920 11:42:04 -- 
spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:13.920 11:42:04 -- common/autotest_common.sh@716 -- # xtrace_disable 00:06:13.920 11:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:13.920 11:42:04 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:13.920 11:42:04 -- spdkcli/tcp.sh@38 -- # killprocess 344534 00:06:13.920 11:42:04 -- common/autotest_common.sh@936 -- # '[' -z 344534 ']' 00:06:13.920 11:42:04 -- common/autotest_common.sh@940 -- # kill -0 344534 00:06:13.920 11:42:04 -- common/autotest_common.sh@941 -- # uname 00:06:13.920 11:42:04 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:13.920 11:42:04 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 344534 00:06:13.920 11:42:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:13.920 11:42:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:13.920 11:42:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 344534' 00:06:13.920 killing process with pid 344534 00:06:13.920 11:42:04 -- common/autotest_common.sh@955 -- # kill 344534 00:06:13.920 11:42:04 -- common/autotest_common.sh@960 -- # wait 344534 00:06:15.828 00:06:15.828 real 0m2.943s 00:06:15.828 user 0m5.048s 00:06:15.828 sys 0m0.646s 00:06:15.828 11:42:06 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:15.828 11:42:06 -- common/autotest_common.sh@10 -- # set +x 00:06:15.828 ************************************ 00:06:15.828 END TEST spdkcli_tcp 00:06:15.828 ************************************ 00:06:15.828 11:42:06 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:15.828 11:42:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:15.828 11:42:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:15.828 11:42:06 -- common/autotest_common.sh@10 -- # set +x 00:06:15.828 ************************************ 00:06:15.828 START TEST dpdk_mem_utility 00:06:15.828 ************************************ 00:06:15.828 11:42:06 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:15.828 * Looking for test storage... 00:06:15.828 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:15.828 11:42:06 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:15.828 11:42:06 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=345363 00:06:15.828 11:42:06 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 345363 00:06:15.828 11:42:06 -- common/autotest_common.sh@817 -- # '[' -z 345363 ']' 00:06:15.828 11:42:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.828 11:42:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:15.828 11:42:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
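The spdkcli_tcp test that just finished exercised the JSON-RPC server over TCP by bridging 127.0.0.1:9998 to the target's UNIX socket with socat. Reduced to its two moving parts, with the same socket paths as in the trace:

# Bridge TCP port 9998 to the running target's UNIX-domain RPC socket.
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
socat_pid=$!
# Query over TCP: retry up to 100 times (-r) with a 2 s timeout (-t);
# this is the call that produced the long method list above.
./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
kill "$socat_pid"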
00:06:15.828 11:42:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:15.828 11:42:06 -- common/autotest_common.sh@10 -- # set +x 00:06:15.828 11:42:06 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:16.087 [2024-04-18 11:42:06.391829] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:16.087 [2024-04-18 11:42:06.391919] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid345363 ] 00:06:16.087 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.087 [2024-04-18 11:42:06.533789] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.347 [2024-04-18 11:42:06.709376] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.940 11:42:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:16.940 11:42:07 -- common/autotest_common.sh@850 -- # return 0 00:06:16.940 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:16.940 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:16.940 11:42:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:16.940 11:42:07 -- common/autotest_common.sh@10 -- # set +x 00:06:16.940 { 00:06:16.940 "filename": "/tmp/spdk_mem_dump.txt" 00:06:16.940 } 00:06:16.940 11:42:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:16.940 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:16.940 DPDK memory size 820.000000 MiB in 1 heap(s) 00:06:16.940 1 heaps totaling size 820.000000 MiB 00:06:16.940 size: 820.000000 MiB heap id: 0 00:06:16.940 end heaps---------- 00:06:16.940 8 mempools totaling size 598.116089 MiB 00:06:16.940 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:16.940 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:16.940 size: 84.521057 MiB name: bdev_io_345363 00:06:16.940 size: 51.011292 MiB name: evtpool_345363 00:06:16.940 size: 50.003479 MiB name: msgpool_345363 00:06:16.940 size: 21.763794 MiB name: PDU_Pool 00:06:16.940 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:16.940 size: 0.026123 MiB name: Session_Pool 00:06:16.940 end mempools------- 00:06:16.940 6 memzones totaling size 4.142822 MiB 00:06:16.940 size: 1.000366 MiB name: RG_ring_0_345363 00:06:16.940 size: 1.000366 MiB name: RG_ring_1_345363 00:06:16.940 size: 1.000366 MiB name: RG_ring_4_345363 00:06:16.940 size: 1.000366 MiB name: RG_ring_5_345363 00:06:16.940 size: 0.125366 MiB name: RG_ring_2_345363 00:06:16.940 size: 0.015991 MiB name: RG_ring_3_345363 00:06:16.940 end memzones------- 00:06:16.940 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:16.940 heap id: 0 total size: 820.000000 MiB number of busy elements: 41 number of free elements: 19 00:06:16.940 list of free elements. 
size: 18.514832 MiB 00:06:16.940 element at address: 0x200000400000 with size: 1.999451 MiB 00:06:16.940 element at address: 0x200000800000 with size: 1.996887 MiB 00:06:16.940 element at address: 0x200007000000 with size: 1.995972 MiB 00:06:16.940 element at address: 0x20000b200000 with size: 1.995972 MiB 00:06:16.940 element at address: 0x200019100040 with size: 0.999939 MiB 00:06:16.940 element at address: 0x200019500040 with size: 0.999939 MiB 00:06:16.940 element at address: 0x200019600000 with size: 0.999329 MiB 00:06:16.940 element at address: 0x200003e00000 with size: 0.996094 MiB 00:06:16.940 element at address: 0x200032200000 with size: 0.994324 MiB 00:06:16.940 element at address: 0x200018e00000 with size: 0.959900 MiB 00:06:16.940 element at address: 0x200019900040 with size: 0.937256 MiB 00:06:16.940 element at address: 0x200000200000 with size: 0.840942 MiB 00:06:16.940 element at address: 0x20001b000000 with size: 0.583191 MiB 00:06:16.940 element at address: 0x200019200000 with size: 0.491150 MiB 00:06:16.940 element at address: 0x200019a00000 with size: 0.485657 MiB 00:06:16.940 element at address: 0x200013800000 with size: 0.470581 MiB 00:06:16.940 element at address: 0x200028400000 with size: 0.411072 MiB 00:06:16.940 element at address: 0x200003a00000 with size: 0.356140 MiB 00:06:16.940 element at address: 0x20000b1ff040 with size: 0.001038 MiB 00:06:16.940 list of standard malloc elements. size: 199.220764 MiB 00:06:16.940 element at address: 0x20000b3fef80 with size: 132.000183 MiB 00:06:16.940 element at address: 0x2000071fef80 with size: 64.000183 MiB 00:06:16.941 element at address: 0x200018ffff80 with size: 1.000183 MiB 00:06:16.941 element at address: 0x2000193fff80 with size: 1.000183 MiB 00:06:16.941 element at address: 0x2000197fff80 with size: 1.000183 MiB 00:06:16.941 element at address: 0x2000003d9e80 with size: 0.140808 MiB 00:06:16.941 element at address: 0x2000199eff40 with size: 0.062683 MiB 00:06:16.941 element at address: 0x2000003fdf40 with size: 0.007996 MiB 00:06:16.941 element at address: 0x2000137ff040 with size: 0.000427 MiB 00:06:16.941 element at address: 0x2000137ffa00 with size: 0.000366 MiB 00:06:16.941 element at address: 0x2000002d7480 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000002d7580 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000002d7680 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000002d7900 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000002d7a00 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000002d7b00 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000003d9d80 with size: 0.000244 MiB 00:06:16.941 element at address: 0x200003aff980 with size: 0.000244 MiB 00:06:16.941 element at address: 0x200003affa80 with size: 0.000244 MiB 00:06:16.941 element at address: 0x200003eff000 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff480 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff580 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff680 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff780 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff880 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ff980 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ffc00 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ffd00 with size: 0.000244 MiB 00:06:16.941 element at address: 0x20000b1ffe00 with size: 0.000244 MiB 
00:06:16.941 element at address: 0x20000b1fff00 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff200 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff300 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff400 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff500 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff600 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff700 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff800 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ff900 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ffb80 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137ffc80 with size: 0.000244 MiB 00:06:16.941 element at address: 0x2000137fff00 with size: 0.000244 MiB 00:06:16.941 list of memzone associated elements. size: 602.264404 MiB 00:06:16.941 element at address: 0x20001b0954c0 with size: 211.416809 MiB 00:06:16.941 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:16.941 element at address: 0x20002846ff80 with size: 157.562622 MiB 00:06:16.941 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:16.941 element at address: 0x2000139fab40 with size: 84.020691 MiB 00:06:16.941 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_345363_0 00:06:16.941 element at address: 0x2000009ff340 with size: 48.003113 MiB 00:06:16.941 associated memzone info: size: 48.002930 MiB name: MP_evtpool_345363_0 00:06:16.941 element at address: 0x200003fff340 with size: 48.003113 MiB 00:06:16.941 associated memzone info: size: 48.002930 MiB name: MP_msgpool_345363_0 00:06:16.941 element at address: 0x200019bbe900 with size: 20.255615 MiB 00:06:16.941 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:16.941 element at address: 0x2000323feb00 with size: 18.005127 MiB 00:06:16.941 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:16.941 element at address: 0x2000005ffdc0 with size: 2.000549 MiB 00:06:16.941 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_345363 00:06:16.941 element at address: 0x200003bffdc0 with size: 2.000549 MiB 00:06:16.941 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_345363 00:06:16.941 element at address: 0x2000002d7c00 with size: 1.008179 MiB 00:06:16.941 associated memzone info: size: 1.007996 MiB name: MP_evtpool_345363 00:06:16.941 element at address: 0x2000192fde00 with size: 1.008179 MiB 00:06:16.941 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:16.941 element at address: 0x200019abc780 with size: 1.008179 MiB 00:06:16.941 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:16.941 element at address: 0x200018efde00 with size: 1.008179 MiB 00:06:16.941 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:16.941 element at address: 0x2000138f89c0 with size: 1.008179 MiB 00:06:16.941 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:16.941 element at address: 0x200003eff100 with size: 1.000549 MiB 00:06:16.941 associated memzone info: size: 1.000366 MiB name: RG_ring_0_345363 00:06:16.941 element at address: 0x200003affb80 with size: 1.000549 MiB 00:06:16.941 associated memzone info: size: 1.000366 MiB name: RG_ring_1_345363 00:06:16.941 element at address: 0x2000196ffd40 with size: 1.000549 MiB 00:06:16.941 associated memzone info: size: 
1.000366 MiB name: RG_ring_4_345363 00:06:16.941 element at address: 0x2000322fe8c0 with size: 1.000549 MiB 00:06:16.941 associated memzone info: size: 1.000366 MiB name: RG_ring_5_345363 00:06:16.941 element at address: 0x200003a5b2c0 with size: 0.500549 MiB 00:06:16.941 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_345363 00:06:16.941 element at address: 0x20001927dbc0 with size: 0.500549 MiB 00:06:16.941 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:16.941 element at address: 0x200013878780 with size: 0.500549 MiB 00:06:16.941 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:16.941 element at address: 0x200019a7c540 with size: 0.250549 MiB 00:06:16.941 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:16.941 element at address: 0x200003adf740 with size: 0.125549 MiB 00:06:16.941 associated memzone info: size: 0.125366 MiB name: RG_ring_2_345363 00:06:16.941 element at address: 0x200018ef5bc0 with size: 0.031799 MiB 00:06:16.941 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:16.941 element at address: 0x2000284693c0 with size: 0.023804 MiB 00:06:16.941 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:16.941 element at address: 0x200003adb500 with size: 0.016174 MiB 00:06:16.941 associated memzone info: size: 0.015991 MiB name: RG_ring_3_345363 00:06:16.941 element at address: 0x20002846f540 with size: 0.002502 MiB 00:06:16.941 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:16.941 element at address: 0x2000002d7780 with size: 0.000366 MiB 00:06:16.941 associated memzone info: size: 0.000183 MiB name: MP_msgpool_345363 00:06:16.941 element at address: 0x2000137ffd80 with size: 0.000366 MiB 00:06:16.941 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_345363 00:06:16.941 element at address: 0x20000b1ffa80 with size: 0.000366 MiB 00:06:16.941 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:16.941 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:16.941 11:42:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 345363 00:06:16.941 11:42:07 -- common/autotest_common.sh@936 -- # '[' -z 345363 ']' 00:06:16.941 11:42:07 -- common/autotest_common.sh@940 -- # kill -0 345363 00:06:16.941 11:42:07 -- common/autotest_common.sh@941 -- # uname 00:06:16.941 11:42:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:16.941 11:42:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 345363 00:06:16.941 11:42:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:16.941 11:42:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:16.941 11:42:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 345363' 00:06:16.941 killing process with pid 345363 00:06:16.941 11:42:07 -- common/autotest_common.sh@955 -- # kill 345363 00:06:16.941 11:42:07 -- common/autotest_common.sh@960 -- # wait 345363 00:06:18.847 00:06:18.847 real 0m2.759s 00:06:18.847 user 0m2.624s 00:06:18.847 sys 0m0.597s 00:06:18.847 11:42:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:18.847 11:42:08 -- common/autotest_common.sh@10 -- # set +x 00:06:18.847 ************************************ 00:06:18.847 END TEST dpdk_mem_utility 00:06:18.847 ************************************ 00:06:18.847 11:42:09 -- spdk/autotest.sh@177 -- # run_test event 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:18.847 11:42:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.847 11:42:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.847 11:42:09 -- common/autotest_common.sh@10 -- # set +x 00:06:18.847 ************************************ 00:06:18.847 START TEST event 00:06:18.847 ************************************ 00:06:18.847 11:42:09 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:18.847 * Looking for test storage... 00:06:18.847 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:18.847 11:42:09 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:18.847 11:42:09 -- bdev/nbd_common.sh@6 -- # set -e 00:06:18.847 11:42:09 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:18.847 11:42:09 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:18.847 11:42:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.847 11:42:09 -- common/autotest_common.sh@10 -- # set +x 00:06:19.107 ************************************ 00:06:19.107 START TEST event_perf 00:06:19.107 ************************************ 00:06:19.107 11:42:09 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:19.107 Running I/O for 1 seconds...[2024-04-18 11:42:09.500791] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:19.107 [2024-04-18 11:42:09.500865] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid345954 ] 00:06:19.107 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.107 [2024-04-18 11:42:09.639696] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:19.366 [2024-04-18 11:42:09.814055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:19.366 [2024-04-18 11:42:09.814118] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:19.366 [2024-04-18 11:42:09.814178] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.366 [2024-04-18 11:42:09.814188] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:20.743 Running I/O for 1 seconds... 00:06:20.743 lcore 0: 184863 00:06:20.743 lcore 1: 184861 00:06:20.743 lcore 2: 184861 00:06:20.743 lcore 3: 184861 00:06:20.743 done. 
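To reproduce the numbers above by hand: -m 0xF runs one reactor on each of cores 0-3 and -t 1 measures for one second; every "lcore N:" line in the output is that reactor's event count for the run.

./test/event/event_perf/event_perf -m 0xF -t 1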
00:06:20.743 00:06:20.743 real 0m1.619s 00:06:20.743 user 0m4.435s 00:06:20.743 sys 0m0.176s 00:06:20.743 11:42:11 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:20.743 11:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:20.743 ************************************ 00:06:20.743 END TEST event_perf 00:06:20.743 ************************************ 00:06:20.743 11:42:11 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:20.743 11:42:11 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:20.743 11:42:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:20.743 11:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:20.743 ************************************ 00:06:20.743 START TEST event_reactor 00:06:20.743 ************************************ 00:06:20.743 11:42:11 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:21.003 [2024-04-18 11:42:11.318121] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:21.003 [2024-04-18 11:42:11.318204] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid346165 ] 00:06:21.003 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.003 [2024-04-18 11:42:11.463178] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.261 [2024-04-18 11:42:11.634710] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.640 test_start 00:06:22.640 oneshot 00:06:22.640 tick 100 00:06:22.640 tick 100 00:06:22.640 tick 250 00:06:22.640 tick 100 00:06:22.640 tick 100 00:06:22.640 tick 100 00:06:22.640 tick 250 00:06:22.640 tick 500 00:06:22.640 tick 100 00:06:22.640 tick 100 00:06:22.640 tick 250 00:06:22.640 tick 100 00:06:22.640 tick 100 00:06:22.640 test_end 00:06:22.640 00:06:22.640 real 0m1.615s 00:06:22.640 user 0m1.430s 00:06:22.640 sys 0m0.176s 00:06:22.640 11:42:12 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:22.640 11:42:12 -- common/autotest_common.sh@10 -- # set +x 00:06:22.640 ************************************ 00:06:22.640 END TEST event_reactor 00:06:22.640 ************************************ 00:06:22.640 11:42:12 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:22.640 11:42:12 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:22.640 11:42:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:22.640 11:42:12 -- common/autotest_common.sh@10 -- # set +x 00:06:22.640 ************************************ 00:06:22.640 START TEST event_reactor_perf 00:06:22.640 ************************************ 00:06:22.640 11:42:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:22.640 [2024-04-18 11:42:13.111406] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:06:22.640 [2024-04-18 11:42:13.111517] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid346377 ] 00:06:22.899 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.899 [2024-04-18 11:42:13.253233] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.899 [2024-04-18 11:42:13.424257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.278 test_start 00:06:24.278 test_end 00:06:24.278 Performance: 725196 events per second 00:06:24.278 00:06:24.278 real 0m1.613s 00:06:24.278 user 0m1.451s 00:06:24.278 sys 0m0.154s 00:06:24.278 11:42:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:24.278 11:42:14 -- common/autotest_common.sh@10 -- # set +x 00:06:24.278 ************************************ 00:06:24.278 END TEST event_reactor_perf 00:06:24.278 ************************************ 00:06:24.278 11:42:14 -- event/event.sh@49 -- # uname -s 00:06:24.278 11:42:14 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:24.278 11:42:14 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:24.278 11:42:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:24.278 11:42:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.278 11:42:14 -- common/autotest_common.sh@10 -- # set +x 00:06:24.537 ************************************ 00:06:24.537 START TEST event_scheduler 00:06:24.537 ************************************ 00:06:24.537 11:42:14 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:24.537 * Looking for test storage... 00:06:24.537 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:24.537 11:42:14 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:24.537 11:42:14 -- scheduler/scheduler.sh@35 -- # scheduler_pid=346771 00:06:24.537 11:42:14 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:24.537 11:42:14 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:24.537 11:42:14 -- scheduler/scheduler.sh@37 -- # waitforlisten 346771 00:06:24.537 11:42:14 -- common/autotest_common.sh@817 -- # '[' -z 346771 ']' 00:06:24.537 11:42:14 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:24.537 11:42:14 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:24.537 11:42:14 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:24.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:24.537 11:42:14 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:24.537 11:42:14 -- common/autotest_common.sh@10 -- # set +x 00:06:24.537 [2024-04-18 11:42:15.018101] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:06:24.537 [2024-04-18 11:42:15.018208] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid346771 ] 00:06:24.797 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.797 [2024-04-18 11:42:15.157806] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:24.797 [2024-04-18 11:42:15.333834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.797 [2024-04-18 11:42:15.333899] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:24.797 [2024-04-18 11:42:15.333945] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:24.797 [2024-04-18 11:42:15.333960] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:25.366 11:42:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:25.366 11:42:15 -- common/autotest_common.sh@850 -- # return 0 00:06:25.366 11:42:15 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:25.366 11:42:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.366 11:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:25.366 POWER: Env isn't set yet! 00:06:25.366 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:25.366 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:25.366 POWER: Cannot set governor of lcore 0 to userspace 00:06:25.366 POWER: Attempting to initialise PSTAT power management... 00:06:25.366 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:25.366 POWER: Initialized successfully for lcore 0 power management 00:06:25.366 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:25.366 POWER: Initialized successfully for lcore 1 power management 00:06:25.366 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:25.366 POWER: Initialized successfully for lcore 2 power management 00:06:25.366 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:25.366 POWER: Initialized successfully for lcore 3 power management 00:06:25.366 11:42:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.367 11:42:15 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:25.367 11:42:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.367 11:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:25.626 [2024-04-18 11:42:16.133618] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
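Because the scheduler app starts with --wait-for-rpc, a scheduler has to be chosen over RPC before initialization continues, which is what the rpc_cmd calls above do; the POWER lines are the app moving each lcore's cpufreq governor to 'performance'. The same sequence by hand, assuming the app listens on the default socket:

./scripts/rpc.py framework_set_scheduler dynamic
./scripts/rpc.py framework_start_init
# Optional sanity check once the reactors are up:
./scripts/rpc.py framework_get_scheduler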
00:06:25.626 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.626 11:42:16 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:25.626 11:42:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:25.626 11:42:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:25.626 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 ************************************ 00:06:25.887 START TEST scheduler_create_thread 00:06:25.887 ************************************ 00:06:25.887 11:42:16 -- common/autotest_common.sh@1111 -- # scheduler_create_thread 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 2 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 3 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 4 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 5 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 6 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 7 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 8 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 9 00:06:25.887 
11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 10 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:25.887 11:42:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:25.887 11:42:16 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:25.887 11:42:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:25.887 11:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:27.795 11:42:17 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:27.795 11:42:17 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:27.795 11:42:17 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:27.795 11:42:17 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:27.795 11:42:17 -- common/autotest_common.sh@10 -- # set +x 00:06:28.733 11:42:18 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:28.733 00:06:28.733 real 0m2.625s 00:06:28.733 user 0m0.024s 00:06:28.733 sys 0m0.006s 00:06:28.733 11:42:18 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:28.733 11:42:18 -- common/autotest_common.sh@10 -- # set +x 00:06:28.733 ************************************ 00:06:28.733 END TEST scheduler_create_thread 00:06:28.733 ************************************ 00:06:28.733 11:42:18 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:28.733 11:42:18 -- scheduler/scheduler.sh@46 -- # killprocess 346771 00:06:28.733 11:42:18 -- common/autotest_common.sh@936 -- # '[' -z 346771 ']' 00:06:28.733 11:42:18 -- common/autotest_common.sh@940 -- # kill -0 346771 00:06:28.733 11:42:18 -- common/autotest_common.sh@941 -- # uname 00:06:28.733 11:42:18 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:28.733 11:42:18 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 346771 00:06:28.733 11:42:19 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:28.733 11:42:19 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:28.733 11:42:19 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 346771' 00:06:28.733 killing process with pid 346771 00:06:28.733 11:42:19 -- common/autotest_common.sh@955 -- # kill 346771 00:06:28.733 11:42:19 -- common/autotest_common.sh@960 -- # wait 346771 00:06:29.023 [2024-04-18 11:42:19.404430] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
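The create/activate/delete cycle above, stripped to its RPC calls. The scheduler_thread_* methods come from a test plugin loaded with --plugin; -n names the thread, -m pins it to a cpumask, and -a appears to set the simulated busy percentage. The ids 11 and 12 are simply what the create calls returned in this run:

rpc='./scripts/rpc.py --plugin scheduler_plugin'
$rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100
$rpc scheduler_thread_create -n half_active -a 0
$rpc scheduler_thread_set_active 11 50
$rpc scheduler_thread_delete 12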
00:06:29.299 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:29.299 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:29.299 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:29.299 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:29.299 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:29.299 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:29.299 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:29.299 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:29.868 00:06:29.868 real 0m5.395s 00:06:29.868 user 0m8.678s 00:06:29.868 sys 0m0.653s 00:06:29.868 11:42:20 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:29.868 11:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:29.868 ************************************ 00:06:29.868 END TEST event_scheduler 00:06:29.868 ************************************ 00:06:29.868 11:42:20 -- event/event.sh@51 -- # modprobe -n nbd 00:06:29.868 11:42:20 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:29.868 11:42:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:29.868 11:42:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.868 11:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:30.128 ************************************ 00:06:30.128 START TEST app_repeat 00:06:30.128 ************************************ 00:06:30.128 11:42:20 -- common/autotest_common.sh@1111 -- # app_repeat_test 00:06:30.128 11:42:20 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.128 11:42:20 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.128 11:42:20 -- event/event.sh@13 -- # local nbd_list 00:06:30.128 11:42:20 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:30.128 11:42:20 -- event/event.sh@14 -- # local bdev_list 00:06:30.128 11:42:20 -- event/event.sh@15 -- # local repeat_times=4 00:06:30.128 11:42:20 -- event/event.sh@17 -- # modprobe nbd 00:06:30.128 11:42:20 -- event/event.sh@19 -- # repeat_pid=347544 00:06:30.128 11:42:20 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:30.128 11:42:20 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 347544' 00:06:30.128 Process app_repeat pid: 347544 00:06:30.128 11:42:20 -- event/event.sh@23 -- # for i in {0..2} 00:06:30.128 11:42:20 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:30.128 spdk_app_start Round 0 00:06:30.128 11:42:20 -- event/event.sh@25 -- # waitforlisten 347544 /var/tmp/spdk-nbd.sock 00:06:30.128 11:42:20 -- common/autotest_common.sh@817 -- # '[' -z 347544 ']' 00:06:30.128 11:42:20 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:30.128 11:42:20 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:30.128 11:42:20 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:30.128 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
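What one app_repeat round boils down to, using the same commands the trace below shows: create a 64 MiB malloc bdev with 4 KiB blocks over the nbd socket, export it as /dev/nbd0, then verify the kernel device the way waitfornbd does:

sock=/var/tmp/spdk-nbd.sock
./scripts/rpc.py -s "$sock" bdev_malloc_create 64 4096
./scripts/rpc.py -s "$sock" nbd_start_disk Malloc0 /dev/nbd0
# The device is ready once it shows up in /proc/partitions...
grep -q -w nbd0 /proc/partitions
# ...and answers a single O_DIRECT 4 KiB read.
dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct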
00:06:30.128 11:42:20 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:30.128 11:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:30.128 11:42:20 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:30.128 [2024-04-18 11:42:20.463292] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:30.128 [2024-04-18 11:42:20.463383] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid347544 ] 00:06:30.128 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.128 [2024-04-18 11:42:20.605983] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:30.388 [2024-04-18 11:42:20.776985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.388 [2024-04-18 11:42:20.776999] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.956 11:42:21 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:30.956 11:42:21 -- common/autotest_common.sh@850 -- # return 0 00:06:30.956 11:42:21 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:30.956 Malloc0 00:06:30.956 11:42:21 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:31.215 Malloc1 00:06:31.215 11:42:21 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@12 -- # local i 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.215 11:42:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:31.475 /dev/nbd0 00:06:31.475 11:42:21 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:31.475 11:42:21 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:31.475 11:42:21 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:31.475 11:42:21 -- common/autotest_common.sh@855 -- # local i 00:06:31.475 11:42:21 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:31.475 11:42:21 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:31.475 11:42:21 -- common/autotest_common.sh@858 -- # grep -q 
-w nbd0 /proc/partitions 00:06:31.475 11:42:21 -- common/autotest_common.sh@859 -- # break 00:06:31.475 11:42:21 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:31.475 11:42:21 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:31.475 11:42:21 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.475 1+0 records in 00:06:31.475 1+0 records out 00:06:31.475 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205636 s, 19.9 MB/s 00:06:31.475 11:42:21 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.475 11:42:21 -- common/autotest_common.sh@872 -- # size=4096 00:06:31.475 11:42:21 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.475 11:42:21 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:31.475 11:42:21 -- common/autotest_common.sh@875 -- # return 0 00:06:31.475 11:42:21 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.475 11:42:21 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.475 11:42:21 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:31.734 /dev/nbd1 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:31.734 11:42:22 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:31.734 11:42:22 -- common/autotest_common.sh@855 -- # local i 00:06:31.734 11:42:22 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:31.734 11:42:22 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:31.734 11:42:22 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:31.734 11:42:22 -- common/autotest_common.sh@859 -- # break 00:06:31.734 11:42:22 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:31.734 11:42:22 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:31.734 11:42:22 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:31.734 1+0 records in 00:06:31.734 1+0 records out 00:06:31.734 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255162 s, 16.1 MB/s 00:06:31.734 11:42:22 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.734 11:42:22 -- common/autotest_common.sh@872 -- # size=4096 00:06:31.734 11:42:22 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:31.734 11:42:22 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:31.734 11:42:22 -- common/autotest_common.sh@875 -- # return 0 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.734 11:42:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:31.993 { 00:06:31.993 "nbd_device": "/dev/nbd0", 00:06:31.993 "bdev_name": "Malloc0" 
00:06:31.993 }, 00:06:31.993 { 00:06:31.993 "nbd_device": "/dev/nbd1", 00:06:31.993 "bdev_name": "Malloc1" 00:06:31.993 } 00:06:31.993 ]' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:31.993 { 00:06:31.993 "nbd_device": "/dev/nbd0", 00:06:31.993 "bdev_name": "Malloc0" 00:06:31.993 }, 00:06:31.993 { 00:06:31.993 "nbd_device": "/dev/nbd1", 00:06:31.993 "bdev_name": "Malloc1" 00:06:31.993 } 00:06:31.993 ]' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:31.993 /dev/nbd1' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:31.993 /dev/nbd1' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@65 -- # count=2 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@95 -- # count=2 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:31.993 11:42:22 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:31.993 256+0 records in 00:06:31.993 256+0 records out 00:06:31.994 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107974 s, 97.1 MB/s 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:31.994 256+0 records in 00:06:31.994 256+0 records out 00:06:31.994 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0240061 s, 43.7 MB/s 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:31.994 256+0 records in 00:06:31.994 256+0 records out 00:06:31.994 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0227078 s, 46.2 MB/s 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@51 -- # local i 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:31.994 11:42:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@41 -- # break 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@45 -- # return 0 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:32.252 11:42:22 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@41 -- # break 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@45 -- # return 0 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:32.511 11:42:22 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:32.511 11:42:23 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:32.511 11:42:23 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:32.511 11:42:23 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:32.511 11:42:23 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@65 -- # true 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@65 -- # count=0 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@104 -- # count=0 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:32.770 11:42:23 -- bdev/nbd_common.sh@109 -- # return 0 00:06:32.770 11:42:23 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:33.029 11:42:23 -- event/event.sh@35 -- # sleep 3 00:06:33.967 [2024-04-18 11:42:24.279894] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:33.967 [2024-04-18 11:42:24.441169] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:33.967 [2024-04-18 11:42:24.441171] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.226 [2024-04-18 11:42:24.582264] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:34.226 [2024-04-18 11:42:24.582340] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:36.134 11:42:26 -- event/event.sh@23 -- # for i in {0..2} 00:06:36.134 11:42:26 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:36.134 spdk_app_start Round 1 00:06:36.134 11:42:26 -- event/event.sh@25 -- # waitforlisten 347544 /var/tmp/spdk-nbd.sock 00:06:36.134 11:42:26 -- common/autotest_common.sh@817 -- # '[' -z 347544 ']' 00:06:36.134 11:42:26 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:36.134 11:42:26 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:36.134 11:42:26 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:36.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:36.134 11:42:26 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:36.134 11:42:26 -- common/autotest_common.sh@10 -- # set +x 00:06:36.134 11:42:26 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:36.134 11:42:26 -- common/autotest_common.sh@850 -- # return 0 00:06:36.134 11:42:26 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:36.393 Malloc0 00:06:36.393 11:42:26 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:36.652 Malloc1 00:06:36.652 11:42:26 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@12 -- # local i 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.652 11:42:26 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:36.652 /dev/nbd0 00:06:36.652 11:42:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:36.652 11:42:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:36.652 11:42:27 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:36.652 11:42:27 -- common/autotest_common.sh@855 -- # local i 00:06:36.652 11:42:27 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:36.652 11:42:27 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:36.652 11:42:27 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:36.652 11:42:27 -- common/autotest_common.sh@859 -- # break 00:06:36.652 11:42:27 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:36.652 11:42:27 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:36.652 11:42:27 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:36.652 1+0 records in 00:06:36.652 1+0 records out 00:06:36.652 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227954 s, 18.0 MB/s 00:06:36.652 11:42:27 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.652 11:42:27 -- common/autotest_common.sh@872 -- # size=4096 00:06:36.652 11:42:27 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.652 11:42:27 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:36.652 11:42:27 -- common/autotest_common.sh@875 -- # return 0 00:06:36.652 11:42:27 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:36.652 11:42:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.652 11:42:27 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:36.912 /dev/nbd1 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:36.912 11:42:27 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:36.912 11:42:27 -- common/autotest_common.sh@855 -- # local i 00:06:36.912 11:42:27 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:36.912 11:42:27 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:36.912 11:42:27 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:36.912 11:42:27 -- common/autotest_common.sh@859 -- # break 00:06:36.912 11:42:27 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:36.912 11:42:27 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:36.912 11:42:27 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:36.912 1+0 records in 00:06:36.912 1+0 records out 00:06:36.912 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258494 s, 15.8 MB/s 00:06:36.912 11:42:27 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.912 11:42:27 -- common/autotest_common.sh@872 -- # size=4096 00:06:36.912 11:42:27 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:36.912 11:42:27 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:36.912 11:42:27 -- common/autotest_common.sh@875 -- # return 0 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@14 -- # (( 
i++ )) 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:36.912 11:42:27 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:37.171 { 00:06:37.171 "nbd_device": "/dev/nbd0", 00:06:37.171 "bdev_name": "Malloc0" 00:06:37.171 }, 00:06:37.171 { 00:06:37.171 "nbd_device": "/dev/nbd1", 00:06:37.171 "bdev_name": "Malloc1" 00:06:37.171 } 00:06:37.171 ]' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:37.171 { 00:06:37.171 "nbd_device": "/dev/nbd0", 00:06:37.171 "bdev_name": "Malloc0" 00:06:37.171 }, 00:06:37.171 { 00:06:37.171 "nbd_device": "/dev/nbd1", 00:06:37.171 "bdev_name": "Malloc1" 00:06:37.171 } 00:06:37.171 ]' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:37.171 /dev/nbd1' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:37.171 /dev/nbd1' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@65 -- # count=2 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@95 -- # count=2 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:37.171 256+0 records in 00:06:37.171 256+0 records out 00:06:37.171 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00458435 s, 229 MB/s 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:37.171 256+0 records in 00:06:37.171 256+0 records out 00:06:37.171 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0183157 s, 57.3 MB/s 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:37.171 256+0 records in 00:06:37.171 256+0 records out 00:06:37.171 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0261224 s, 40.1 MB/s 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:37.171 11:42:27 -- bdev/nbd_common.sh@71 -- # local operation=verify 
00:06:37.171 11:42:27 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@51 -- # local i 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:37.172 11:42:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@41 -- # break 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@45 -- # return 0 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:37.431 11:42:27 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@41 -- # break 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@45 -- # return 0 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.690 11:42:28 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:37.950 11:42:28 -- 
bdev/nbd_common.sh@65 -- # echo '' 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@65 -- # true 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@65 -- # count=0 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@104 -- # count=0 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:37.950 11:42:28 -- bdev/nbd_common.sh@109 -- # return 0 00:06:37.950 11:42:28 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:38.209 11:42:28 -- event/event.sh@35 -- # sleep 3 00:06:39.147 [2024-04-18 11:42:29.501587] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:39.147 [2024-04-18 11:42:29.663171] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.147 [2024-04-18 11:42:29.663177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:39.407 [2024-04-18 11:42:29.804221] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:39.407 [2024-04-18 11:42:29.804279] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:41.312 11:42:31 -- event/event.sh@23 -- # for i in {0..2} 00:06:41.312 11:42:31 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:41.312 spdk_app_start Round 2 00:06:41.312 11:42:31 -- event/event.sh@25 -- # waitforlisten 347544 /var/tmp/spdk-nbd.sock 00:06:41.312 11:42:31 -- common/autotest_common.sh@817 -- # '[' -z 347544 ']' 00:06:41.312 11:42:31 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:41.312 11:42:31 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:41.312 11:42:31 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:41.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
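Between rounds the harness does not kill the app from outside; it asks the app to deliver the signal to itself over RPC and then gives the framework a moment to cycle, which is what produces the shutdown and reactor-restart notices above (event.sh lines 34-35 in the trace):

/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
    -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
sleep 3    # let spdk_app_start() return and the next round begin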
00:06:41.312 11:42:31 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:41.312 11:42:31 -- common/autotest_common.sh@10 -- # set +x 00:06:41.312 11:42:31 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:41.312 11:42:31 -- common/autotest_common.sh@850 -- # return 0 00:06:41.312 11:42:31 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.571 Malloc0 00:06:41.571 11:42:31 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:41.831 Malloc1 00:06:41.831 11:42:32 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@12 -- # local i 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:41.831 11:42:32 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:41.831 /dev/nbd0 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:42.090 11:42:32 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:42.090 11:42:32 -- common/autotest_common.sh@855 -- # local i 00:06:42.090 11:42:32 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:42.090 11:42:32 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:42.090 11:42:32 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:42.090 11:42:32 -- common/autotest_common.sh@859 -- # break 00:06:42.090 11:42:32 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:42.090 11:42:32 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:42.090 11:42:32 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:42.090 1+0 records in 00:06:42.090 1+0 records out 00:06:42.090 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245431 s, 16.7 MB/s 00:06:42.090 11:42:32 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:42.090 11:42:32 -- common/autotest_common.sh@872 -- # size=4096 00:06:42.090 11:42:32 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:42.090 11:42:32 -- 
common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:42.090 11:42:32 -- common/autotest_common.sh@875 -- # return 0 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:42.090 /dev/nbd1 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:42.090 11:42:32 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:42.090 11:42:32 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:42.090 11:42:32 -- common/autotest_common.sh@855 -- # local i 00:06:42.090 11:42:32 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:42.090 11:42:32 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:42.091 11:42:32 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:42.091 11:42:32 -- common/autotest_common.sh@859 -- # break 00:06:42.091 11:42:32 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:42.091 11:42:32 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:42.091 11:42:32 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:42.091 1+0 records in 00:06:42.091 1+0 records out 00:06:42.091 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000140895 s, 29.1 MB/s 00:06:42.091 11:42:32 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:42.091 11:42:32 -- common/autotest_common.sh@872 -- # size=4096 00:06:42.091 11:42:32 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:42.091 11:42:32 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:42.091 11:42:32 -- common/autotest_common.sh@875 -- # return 0 00:06:42.091 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:42.091 11:42:32 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:42.091 11:42:32 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:42.091 11:42:32 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.091 11:42:32 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:42.350 { 00:06:42.350 "nbd_device": "/dev/nbd0", 00:06:42.350 "bdev_name": "Malloc0" 00:06:42.350 }, 00:06:42.350 { 00:06:42.350 "nbd_device": "/dev/nbd1", 00:06:42.350 "bdev_name": "Malloc1" 00:06:42.350 } 00:06:42.350 ]' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:42.350 { 00:06:42.350 "nbd_device": "/dev/nbd0", 00:06:42.350 "bdev_name": "Malloc0" 00:06:42.350 }, 00:06:42.350 { 00:06:42.350 "nbd_device": "/dev/nbd1", 00:06:42.350 "bdev_name": "Malloc1" 00:06:42.350 } 00:06:42.350 ]' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:42.350 /dev/nbd1' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:42.350 /dev/nbd1' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@65 -- # count=2 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:42.350 11:42:32 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:42.350 256+0 records in 00:06:42.350 256+0 records out 00:06:42.350 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00354188 s, 296 MB/s 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:42.350 256+0 records in 00:06:42.350 256+0 records out 00:06:42.350 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.023668 s, 44.3 MB/s 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:42.350 256+0 records in 00:06:42.350 256+0 records out 00:06:42.350 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0220558 s, 47.5 MB/s 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:42.350 11:42:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@51 -- # local i 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:42.610 11:42:32 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@41 -- # break 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:42.610 11:42:33 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@41 -- # break 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@45 -- # return 0 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:42.869 11:42:33 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@65 -- # true 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@65 -- # count=0 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@104 -- # count=0 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:43.127 11:42:33 -- bdev/nbd_common.sh@109 -- # return 0 00:06:43.127 11:42:33 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:43.387 11:42:33 -- event/event.sh@35 -- # sleep 3 00:06:44.325 [2024-04-18 11:42:34.757806] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:44.584 [2024-04-18 11:42:34.926053] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:44.584 [2024-04-18 11:42:34.926057] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.584 [2024-04-18 11:42:35.064488] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:44.584 [2024-04-18 11:42:35.064558] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
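Each round's data path is the same write-then-verify pass over both NBD devices, visible in the dd and cmp lines above. Condensed into a sketch (nbdrandtest is the temp file the test uses; $spdk is shorthand for this workspace checkout):

spdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
tmp=$spdk/test/event/nbdrandtest

dd if=/dev/urandom of=$tmp bs=4096 count=256                # 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=$tmp of=$nbd bs=4096 count=256 oflag=direct       # push it through the NBD device
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M $tmp $nbd                                  # read back and compare byte-for-byte
done
rm $tmp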
00:06:46.493 11:42:36 -- event/event.sh@38 -- # waitforlisten 347544 /var/tmp/spdk-nbd.sock 00:06:46.493 11:42:36 -- common/autotest_common.sh@817 -- # '[' -z 347544 ']' 00:06:46.493 11:42:36 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:46.493 11:42:36 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:46.493 11:42:36 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:46.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:46.493 11:42:36 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:46.493 11:42:36 -- common/autotest_common.sh@10 -- # set +x 00:06:46.493 11:42:37 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:46.493 11:42:37 -- common/autotest_common.sh@850 -- # return 0 00:06:46.493 11:42:37 -- event/event.sh@39 -- # killprocess 347544 00:06:46.493 11:42:37 -- common/autotest_common.sh@936 -- # '[' -z 347544 ']' 00:06:46.493 11:42:37 -- common/autotest_common.sh@940 -- # kill -0 347544 00:06:46.493 11:42:37 -- common/autotest_common.sh@941 -- # uname 00:06:46.493 11:42:37 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:46.493 11:42:37 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 347544 00:06:46.753 11:42:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:46.753 11:42:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:46.753 11:42:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 347544' 00:06:46.753 killing process with pid 347544 00:06:46.753 11:42:37 -- common/autotest_common.sh@955 -- # kill 347544 00:06:46.753 11:42:37 -- common/autotest_common.sh@960 -- # wait 347544 00:06:47.321 spdk_app_start is called in Round 0. 00:06:47.321 Shutdown signal received, stop current app iteration 00:06:47.321 Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 reinitialization... 00:06:47.321 spdk_app_start is called in Round 1. 00:06:47.321 Shutdown signal received, stop current app iteration 00:06:47.321 Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 reinitialization... 00:06:47.321 spdk_app_start is called in Round 2. 00:06:47.321 Shutdown signal received, stop current app iteration 00:06:47.321 Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 reinitialization... 00:06:47.321 spdk_app_start is called in Round 3. 
00:06:47.321 Shutdown signal received, stop current app iteration 00:06:47.321 11:42:37 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:47.321 11:42:37 -- event/event.sh@42 -- # return 0 00:06:47.321 00:06:47.321 real 0m17.432s 00:06:47.321 user 0m35.208s 00:06:47.321 sys 0m3.380s 00:06:47.321 11:42:37 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:47.321 11:42:37 -- common/autotest_common.sh@10 -- # set +x 00:06:47.321 ************************************ 00:06:47.322 END TEST app_repeat 00:06:47.322 ************************************ 00:06:47.581 11:42:37 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:47.581 11:42:37 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:47.581 11:42:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:47.581 11:42:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:47.581 11:42:37 -- common/autotest_common.sh@10 -- # set +x 00:06:47.581 ************************************ 00:06:47.581 START TEST cpu_locks 00:06:47.581 ************************************ 00:06:47.581 11:42:38 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:47.581 * Looking for test storage... 00:06:47.581 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:47.581 11:42:38 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:47.581 11:42:38 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:47.581 11:42:38 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:47.581 11:42:38 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:47.581 11:42:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:47.581 11:42:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:47.581 11:42:38 -- common/autotest_common.sh@10 -- # set +x 00:06:47.841 ************************************ 00:06:47.841 START TEST default_locks 00:06:47.841 ************************************ 00:06:47.841 11:42:38 -- common/autotest_common.sh@1111 -- # default_locks 00:06:47.841 11:42:38 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=350067 00:06:47.841 11:42:38 -- event/cpu_locks.sh@47 -- # waitforlisten 350067 00:06:47.841 11:42:38 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:47.841 11:42:38 -- common/autotest_common.sh@817 -- # '[' -z 350067 ']' 00:06:47.841 11:42:38 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.841 11:42:38 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:47.841 11:42:38 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.841 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:47.841 11:42:38 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:47.841 11:42:38 -- common/autotest_common.sh@10 -- # set +x 00:06:47.841 [2024-04-18 11:42:38.324570] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
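killprocess, used above to stop the app_repeat instance and again below for each spdk_tgt, checks what it is about to signal before sending anything. A simplified rendering of what the trace shows of the autotest_common.sh helper (the retry logic and error paths are omitted, and the sudo branch is only hinted at in the log):

killprocess() {
    local pid=$1
    kill -0 "$pid" || return                    # bail out if it is already gone
    local name
    name=$(ps --no-headers -o comm= "$pid")
    # sudo-wrapped processes take a different path upstream; omitted here
    [ "$name" = sudo ] && return
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                                 # reap it and propagate the exit status
}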
00:06:47.841 [2024-04-18 11:42:38.324660] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid350067 ] 00:06:48.100 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.100 [2024-04-18 11:42:38.471496] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.100 [2024-04-18 11:42:38.639953] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.038 11:42:39 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:49.038 11:42:39 -- common/autotest_common.sh@850 -- # return 0 00:06:49.038 11:42:39 -- event/cpu_locks.sh@49 -- # locks_exist 350067 00:06:49.038 11:42:39 -- event/cpu_locks.sh@22 -- # lslocks -p 350067 00:06:49.038 11:42:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:49.607 lslocks: write error 00:06:49.607 11:42:39 -- event/cpu_locks.sh@50 -- # killprocess 350067 00:06:49.607 11:42:39 -- common/autotest_common.sh@936 -- # '[' -z 350067 ']' 00:06:49.607 11:42:39 -- common/autotest_common.sh@940 -- # kill -0 350067 00:06:49.607 11:42:39 -- common/autotest_common.sh@941 -- # uname 00:06:49.607 11:42:39 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:49.607 11:42:39 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 350067 00:06:49.608 11:42:39 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:49.608 11:42:39 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:49.608 11:42:39 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 350067' 00:06:49.608 killing process with pid 350067 00:06:49.608 11:42:39 -- common/autotest_common.sh@955 -- # kill 350067 00:06:49.608 11:42:39 -- common/autotest_common.sh@960 -- # wait 350067 00:06:50.987 11:42:41 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 350067 00:06:50.987 11:42:41 -- common/autotest_common.sh@638 -- # local es=0 00:06:50.987 11:42:41 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 350067 00:06:50.987 11:42:41 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:50.987 11:42:41 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:50.987 11:42:41 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:50.987 11:42:41 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:50.987 11:42:41 -- common/autotest_common.sh@641 -- # waitforlisten 350067 00:06:50.987 11:42:41 -- common/autotest_common.sh@817 -- # '[' -z 350067 ']' 00:06:50.987 11:42:41 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.987 11:42:41 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:50.987 11:42:41 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.987 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
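The default_locks test above hinges on spdk_tgt taking a POSIX file lock per claimed CPU core, so ownership can be observed from outside the process. As the trace shows it, locks_exist is essentially the one-liner below; the stray "lslocks: write error" is most likely lslocks complaining about the pipe that grep -q closes as soon as it matches, not a test failure:

locks_exist() {
    lslocks -p "$1" | grep -q spdk_cpu_lock    # does this PID hold a core lock?
}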
00:06:50.987 11:42:41 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:50.987 11:42:41 -- common/autotest_common.sh@10 -- # set +x 00:06:50.987 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (350067) - No such process 00:06:50.987 ERROR: process (pid: 350067) is no longer running 00:06:50.987 11:42:41 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:50.987 11:42:41 -- common/autotest_common.sh@850 -- # return 1 00:06:50.987 11:42:41 -- common/autotest_common.sh@641 -- # es=1 00:06:50.987 11:42:41 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:50.987 11:42:41 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:50.987 11:42:41 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:50.987 11:42:41 -- event/cpu_locks.sh@54 -- # no_locks 00:06:50.987 11:42:41 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:50.987 11:42:41 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:50.987 11:42:41 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:50.987 00:06:50.987 real 0m3.197s 00:06:50.987 user 0m3.109s 00:06:50.987 sys 0m0.894s 00:06:50.987 11:42:41 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:50.987 11:42:41 -- common/autotest_common.sh@10 -- # set +x 00:06:50.987 ************************************ 00:06:50.987 END TEST default_locks 00:06:50.987 ************************************ 00:06:50.987 11:42:41 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:50.987 11:42:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:50.987 11:42:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:50.987 11:42:41 -- common/autotest_common.sh@10 -- # set +x 00:06:51.246 ************************************ 00:06:51.246 START TEST default_locks_via_rpc 00:06:51.246 ************************************ 00:06:51.246 11:42:41 -- common/autotest_common.sh@1111 -- # default_locks_via_rpc 00:06:51.246 11:42:41 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=350629 00:06:51.246 11:42:41 -- event/cpu_locks.sh@63 -- # waitforlisten 350629 00:06:51.246 11:42:41 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:51.246 11:42:41 -- common/autotest_common.sh@817 -- # '[' -z 350629 ']' 00:06:51.246 11:42:41 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.246 11:42:41 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:51.246 11:42:41 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.246 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:51.246 11:42:41 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:51.246 11:42:41 -- common/autotest_common.sh@10 -- # set +x 00:06:51.246 [2024-04-18 11:42:41.716304] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:06:51.246 [2024-04-18 11:42:41.716403] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid350629 ] 00:06:51.504 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.504 [2024-04-18 11:42:41.861112] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.504 [2024-04-18 11:42:42.029444] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.073 11:42:42 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:52.073 11:42:42 -- common/autotest_common.sh@850 -- # return 0 00:06:52.073 11:42:42 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:52.073 11:42:42 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:52.073 11:42:42 -- common/autotest_common.sh@10 -- # set +x 00:06:52.332 11:42:42 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:52.332 11:42:42 -- event/cpu_locks.sh@67 -- # no_locks 00:06:52.332 11:42:42 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:52.332 11:42:42 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:52.332 11:42:42 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:52.332 11:42:42 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:52.332 11:42:42 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:52.332 11:42:42 -- common/autotest_common.sh@10 -- # set +x 00:06:52.332 11:42:42 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:52.332 11:42:42 -- event/cpu_locks.sh@71 -- # locks_exist 350629 00:06:52.332 11:42:42 -- event/cpu_locks.sh@22 -- # lslocks -p 350629 00:06:52.332 11:42:42 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:52.592 11:42:43 -- event/cpu_locks.sh@73 -- # killprocess 350629 00:06:52.592 11:42:43 -- common/autotest_common.sh@936 -- # '[' -z 350629 ']' 00:06:52.592 11:42:43 -- common/autotest_common.sh@940 -- # kill -0 350629 00:06:52.592 11:42:43 -- common/autotest_common.sh@941 -- # uname 00:06:52.592 11:42:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:52.592 11:42:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 350629 00:06:52.592 11:42:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:52.592 11:42:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:52.592 11:42:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 350629' 00:06:52.592 killing process with pid 350629 00:06:52.592 11:42:43 -- common/autotest_common.sh@955 -- # kill 350629 00:06:52.592 11:42:43 -- common/autotest_common.sh@960 -- # wait 350629 00:06:54.498 00:06:54.498 real 0m2.976s 00:06:54.498 user 0m2.872s 00:06:54.498 sys 0m0.744s 00:06:54.498 11:42:44 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:54.498 11:42:44 -- common/autotest_common.sh@10 -- # set +x 00:06:54.498 ************************************ 00:06:54.498 END TEST default_locks_via_rpc 00:06:54.498 ************************************ 00:06:54.498 11:42:44 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:54.498 11:42:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:54.498 11:42:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.498 11:42:44 -- common/autotest_common.sh@10 -- # set +x 00:06:54.498 ************************************ 00:06:54.498 START TEST non_locking_app_on_locked_coremask 00:06:54.498 
************************************ 00:06:54.498 11:42:44 -- common/autotest_common.sh@1111 -- # non_locking_app_on_locked_coremask 00:06:54.498 11:42:44 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=351025 00:06:54.498 11:42:44 -- event/cpu_locks.sh@81 -- # waitforlisten 351025 /var/tmp/spdk.sock 00:06:54.498 11:42:44 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:54.498 11:42:44 -- common/autotest_common.sh@817 -- # '[' -z 351025 ']' 00:06:54.498 11:42:44 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:54.498 11:42:44 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:54.498 11:42:44 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:54.498 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:54.498 11:42:44 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:54.498 11:42:44 -- common/autotest_common.sh@10 -- # set +x 00:06:54.498 [2024-04-18 11:42:44.877576] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:54.498 [2024-04-18 11:42:44.877664] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid351025 ] 00:06:54.498 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.498 [2024-04-18 11:42:45.014351] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.758 [2024-04-18 11:42:45.181601] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.414 11:42:45 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:55.414 11:42:45 -- common/autotest_common.sh@850 -- # return 0 00:06:55.414 11:42:45 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:55.414 11:42:45 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=351197 00:06:55.414 11:42:45 -- event/cpu_locks.sh@85 -- # waitforlisten 351197 /var/tmp/spdk2.sock 00:06:55.414 11:42:45 -- common/autotest_common.sh@817 -- # '[' -z 351197 ']' 00:06:55.414 11:42:45 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:55.414 11:42:45 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:55.414 11:42:45 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:55.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:55.414 11:42:45 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:55.414 11:42:45 -- common/autotest_common.sh@10 -- # set +x 00:06:55.414 [2024-04-18 11:42:45.826672] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:06:55.414 [2024-04-18 11:42:45.826773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid351197 ] 00:06:55.414 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.673 [2024-04-18 11:42:46.019798] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
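(Note on the NOTICE just above: an spdk_tgt instance normally takes one lock file per claimed core under /var/tmp/spdk_cpu_lock_*, and --disable-cpumask-locks, as passed to the second target here, skips that claim so two instances can share core 0. The locks_exist helper traced earlier in this log reduces to the following sketch; lslocks comes from util-linux, and the pid argument is whichever target is under test:)

    # succeeds if the process holds at least one SPDK CPU core lock
    locks_exist() {
        local pid=$1
        # grep -q exits on first match, so lslocks may report a harmless
        # "write error" on the broken pipe (visible later in this log)
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }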
00:06:55.673 [2024-04-18 11:42:46.019846] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.931 [2024-04-18 11:42:46.355988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.308 11:42:47 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:57.308 11:42:47 -- common/autotest_common.sh@850 -- # return 0 00:06:57.308 11:42:47 -- event/cpu_locks.sh@87 -- # locks_exist 351025 00:06:57.308 11:42:47 -- event/cpu_locks.sh@22 -- # lslocks -p 351025 00:06:57.308 11:42:47 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:58.243 lslocks: write error 00:06:58.243 11:42:48 -- event/cpu_locks.sh@89 -- # killprocess 351025 00:06:58.243 11:42:48 -- common/autotest_common.sh@936 -- # '[' -z 351025 ']' 00:06:58.243 11:42:48 -- common/autotest_common.sh@940 -- # kill -0 351025 00:06:58.243 11:42:48 -- common/autotest_common.sh@941 -- # uname 00:06:58.243 11:42:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:58.243 11:42:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 351025 00:06:58.243 11:42:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:58.243 11:42:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:58.243 11:42:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 351025' 00:06:58.243 killing process with pid 351025 00:06:58.243 11:42:48 -- common/autotest_common.sh@955 -- # kill 351025 00:06:58.243 11:42:48 -- common/autotest_common.sh@960 -- # wait 351025 00:07:01.534 11:42:51 -- event/cpu_locks.sh@90 -- # killprocess 351197 00:07:01.534 11:42:51 -- common/autotest_common.sh@936 -- # '[' -z 351197 ']' 00:07:01.534 11:42:51 -- common/autotest_common.sh@940 -- # kill -0 351197 00:07:01.534 11:42:51 -- common/autotest_common.sh@941 -- # uname 00:07:01.534 11:42:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:01.534 11:42:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 351197 00:07:01.534 11:42:51 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:01.534 11:42:51 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:01.534 11:42:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 351197' 00:07:01.534 killing process with pid 351197 00:07:01.534 11:42:51 -- common/autotest_common.sh@955 -- # kill 351197 00:07:01.534 11:42:51 -- common/autotest_common.sh@960 -- # wait 351197 00:07:02.914 00:07:02.914 real 0m8.575s 00:07:02.914 user 0m8.600s 00:07:02.914 sys 0m1.556s 00:07:02.914 11:42:53 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:02.914 11:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:02.914 ************************************ 00:07:02.914 END TEST non_locking_app_on_locked_coremask 00:07:02.914 ************************************ 00:07:02.914 11:42:53 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:02.914 11:42:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:02.914 11:42:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.914 11:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:03.173 ************************************ 00:07:03.173 START TEST locking_app_on_unlocked_coremask 00:07:03.173 ************************************ 00:07:03.173 11:42:53 -- common/autotest_common.sh@1111 -- # locking_app_on_unlocked_coremask 00:07:03.173 11:42:53 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=352222 00:07:03.173 11:42:53 -- event/cpu_locks.sh@99 -- # 
waitforlisten 352222 /var/tmp/spdk.sock 00:07:03.173 11:42:53 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:03.173 11:42:53 -- common/autotest_common.sh@817 -- # '[' -z 352222 ']' 00:07:03.173 11:42:53 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.173 11:42:53 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:03.174 11:42:53 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.174 11:42:53 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:03.174 11:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:03.174 [2024-04-18 11:42:53.644792] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:03.174 [2024-04-18 11:42:53.644870] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid352222 ] 00:07:03.432 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.432 [2024-04-18 11:42:53.789153] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:03.432 [2024-04-18 11:42:53.789197] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.432 [2024-04-18 11:42:53.956209] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.001 11:42:54 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:04.001 11:42:54 -- common/autotest_common.sh@850 -- # return 0 00:07:04.001 11:42:54 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=352305 00:07:04.001 11:42:54 -- event/cpu_locks.sh@103 -- # waitforlisten 352305 /var/tmp/spdk2.sock 00:07:04.001 11:42:54 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:04.001 11:42:54 -- common/autotest_common.sh@817 -- # '[' -z 352305 ']' 00:07:04.001 11:42:54 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:04.001 11:42:54 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:04.001 11:42:54 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:04.001 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:04.001 11:42:54 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:04.001 11:42:54 -- common/autotest_common.sh@10 -- # set +x 00:07:04.260 [2024-04-18 11:42:54.581718] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:04.260 [2024-04-18 11:42:54.581827] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid352305 ] 00:07:04.260 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.260 [2024-04-18 11:42:54.781562] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.829 [2024-04-18 11:42:55.126078] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.767 11:42:56 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:05.767 11:42:56 -- common/autotest_common.sh@850 -- # return 0 00:07:05.767 11:42:56 -- event/cpu_locks.sh@105 -- # locks_exist 352305 00:07:05.767 11:42:56 -- event/cpu_locks.sh@22 -- # lslocks -p 352305 00:07:05.767 11:42:56 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:07.148 lslocks: write error 00:07:07.148 11:42:57 -- event/cpu_locks.sh@107 -- # killprocess 352222 00:07:07.148 11:42:57 -- common/autotest_common.sh@936 -- # '[' -z 352222 ']' 00:07:07.148 11:42:57 -- common/autotest_common.sh@940 -- # kill -0 352222 00:07:07.148 11:42:57 -- common/autotest_common.sh@941 -- # uname 00:07:07.148 11:42:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:07.148 11:42:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 352222 00:07:07.148 11:42:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:07.148 11:42:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:07.148 11:42:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 352222' 00:07:07.148 killing process with pid 352222 00:07:07.148 11:42:57 -- common/autotest_common.sh@955 -- # kill 352222 00:07:07.148 11:42:57 -- common/autotest_common.sh@960 -- # wait 352222 00:07:10.440 11:43:00 -- event/cpu_locks.sh@108 -- # killprocess 352305 00:07:10.440 11:43:00 -- common/autotest_common.sh@936 -- # '[' -z 352305 ']' 00:07:10.440 11:43:00 -- common/autotest_common.sh@940 -- # kill -0 352305 00:07:10.440 11:43:00 -- common/autotest_common.sh@941 -- # uname 00:07:10.440 11:43:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:10.440 11:43:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 352305 00:07:10.440 11:43:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:10.440 11:43:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:10.440 11:43:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 352305' 00:07:10.440 killing process with pid 352305 00:07:10.440 11:43:00 -- common/autotest_common.sh@955 -- # kill 352305 00:07:10.440 11:43:00 -- common/autotest_common.sh@960 -- # wait 352305 00:07:11.821 00:07:11.821 real 0m8.461s 00:07:11.821 user 0m8.431s 00:07:11.821 sys 0m1.522s 00:07:11.821 11:43:02 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:11.821 11:43:02 -- common/autotest_common.sh@10 -- # set +x 00:07:11.821 ************************************ 00:07:11.821 END TEST locking_app_on_unlocked_coremask 00:07:11.821 ************************************ 00:07:11.821 11:43:02 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:11.821 11:43:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:11.821 11:43:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:11.821 11:43:02 -- common/autotest_common.sh@10 -- # set +x 00:07:11.821 
************************************ 00:07:11.822 START TEST locking_app_on_locked_coremask 00:07:11.822 ************************************ 00:07:11.822 11:43:02 -- common/autotest_common.sh@1111 -- # locking_app_on_locked_coremask 00:07:11.822 11:43:02 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=353399 00:07:11.822 11:43:02 -- event/cpu_locks.sh@116 -- # waitforlisten 353399 /var/tmp/spdk.sock 00:07:11.822 11:43:02 -- common/autotest_common.sh@817 -- # '[' -z 353399 ']' 00:07:11.822 11:43:02 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:11.822 11:43:02 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:11.822 11:43:02 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:11.822 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:11.822 11:43:02 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:11.822 11:43:02 -- common/autotest_common.sh@10 -- # set +x 00:07:11.822 11:43:02 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:11.822 [2024-04-18 11:43:02.285179] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:11.822 [2024-04-18 11:43:02.285260] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353399 ] 00:07:11.822 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.081 [2024-04-18 11:43:02.426830] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.081 [2024-04-18 11:43:02.598249] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.651 11:43:03 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:12.651 11:43:03 -- common/autotest_common.sh@850 -- # return 0 00:07:12.651 11:43:03 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=353530 00:07:12.651 11:43:03 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 353530 /var/tmp/spdk2.sock 00:07:12.651 11:43:03 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:12.651 11:43:03 -- common/autotest_common.sh@638 -- # local es=0 00:07:12.651 11:43:03 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 353530 /var/tmp/spdk2.sock 00:07:12.651 11:43:03 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:07:12.651 11:43:03 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:12.651 11:43:03 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:07:12.910 11:43:03 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:12.910 11:43:03 -- common/autotest_common.sh@641 -- # waitforlisten 353530 /var/tmp/spdk2.sock 00:07:12.910 11:43:03 -- common/autotest_common.sh@817 -- # '[' -z 353530 ']' 00:07:12.910 11:43:03 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:12.910 11:43:03 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:12.910 11:43:03 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:12.910 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:12.910 11:43:03 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:12.910 11:43:03 -- common/autotest_common.sh@10 -- # set +x 00:07:12.910 [2024-04-18 11:43:03.224589] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:12.910 [2024-04-18 11:43:03.224676] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353530 ] 00:07:12.910 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.910 [2024-04-18 11:43:03.423746] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 353399 has claimed it. 00:07:12.910 [2024-04-18 11:43:03.423802] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:13.478 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (353530) - No such process 00:07:13.478 ERROR: process (pid: 353530) is no longer running 00:07:13.478 11:43:03 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:13.478 11:43:03 -- common/autotest_common.sh@850 -- # return 1 00:07:13.478 11:43:03 -- common/autotest_common.sh@641 -- # es=1 00:07:13.478 11:43:03 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:13.478 11:43:03 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:13.478 11:43:03 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:13.478 11:43:03 -- event/cpu_locks.sh@122 -- # locks_exist 353399 00:07:13.478 11:43:03 -- event/cpu_locks.sh@22 -- # lslocks -p 353399 00:07:13.478 11:43:03 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:13.738 lslocks: write error 00:07:13.738 11:43:04 -- event/cpu_locks.sh@124 -- # killprocess 353399 00:07:13.738 11:43:04 -- common/autotest_common.sh@936 -- # '[' -z 353399 ']' 00:07:13.738 11:43:04 -- common/autotest_common.sh@940 -- # kill -0 353399 00:07:13.738 11:43:04 -- common/autotest_common.sh@941 -- # uname 00:07:13.738 11:43:04 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:13.738 11:43:04 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 353399 00:07:13.738 11:43:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:13.738 11:43:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:13.738 11:43:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 353399' 00:07:13.738 killing process with pid 353399 00:07:13.738 11:43:04 -- common/autotest_common.sh@955 -- # kill 353399 00:07:13.738 11:43:04 -- common/autotest_common.sh@960 -- # wait 353399 00:07:15.646 00:07:15.646 real 0m3.574s 00:07:15.646 user 0m3.564s 00:07:15.646 sys 0m0.864s 00:07:15.646 11:43:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:15.646 11:43:05 -- common/autotest_common.sh@10 -- # set +x 00:07:15.646 ************************************ 00:07:15.646 END TEST locking_app_on_locked_coremask 00:07:15.646 ************************************ 00:07:15.646 11:43:05 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:15.646 11:43:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:15.647 11:43:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.647 11:43:05 -- common/autotest_common.sh@10 -- # set +x 00:07:15.647 ************************************ 00:07:15.647 START TEST locking_overlapped_coremask 00:07:15.647 
************************************ 00:07:15.647 11:43:06 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask 00:07:15.647 11:43:06 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=353961 00:07:15.647 11:43:06 -- event/cpu_locks.sh@133 -- # waitforlisten 353961 /var/tmp/spdk.sock 00:07:15.647 11:43:06 -- common/autotest_common.sh@817 -- # '[' -z 353961 ']' 00:07:15.647 11:43:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:15.647 11:43:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:15.647 11:43:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:15.647 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:15.647 11:43:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:15.647 11:43:06 -- common/autotest_common.sh@10 -- # set +x 00:07:15.647 11:43:06 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:15.647 [2024-04-18 11:43:06.049932] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:15.647 [2024-04-18 11:43:06.050020] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353961 ] 00:07:15.647 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.647 [2024-04-18 11:43:06.194787] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:15.906 [2024-04-18 11:43:06.366346] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:15.906 [2024-04-18 11:43:06.366389] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.906 [2024-04-18 11:43:06.366401] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:16.474 11:43:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:16.474 11:43:06 -- common/autotest_common.sh@850 -- # return 0 00:07:16.474 11:43:06 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=353986 00:07:16.474 11:43:06 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 353986 /var/tmp/spdk2.sock 00:07:16.474 11:43:06 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:16.474 11:43:06 -- common/autotest_common.sh@638 -- # local es=0 00:07:16.474 11:43:06 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 353986 /var/tmp/spdk2.sock 00:07:16.474 11:43:06 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:07:16.474 11:43:06 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:16.474 11:43:06 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:07:16.474 11:43:06 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:16.475 11:43:06 -- common/autotest_common.sh@641 -- # waitforlisten 353986 /var/tmp/spdk2.sock 00:07:16.475 11:43:06 -- common/autotest_common.sh@817 -- # '[' -z 353986 ']' 00:07:16.475 11:43:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:16.475 11:43:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:16.475 11:43:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:07:16.475 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:16.475 11:43:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:16.475 11:43:06 -- common/autotest_common.sh@10 -- # set +x 00:07:16.733 [2024-04-18 11:43:07.030661] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:16.733 [2024-04-18 11:43:07.030748] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid353986 ] 00:07:16.733 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.733 [2024-04-18 11:43:07.234112] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 353961 has claimed it. 00:07:16.733 [2024-04-18 11:43:07.234166] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:17.301 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (353986) - No such process 00:07:17.301 ERROR: process (pid: 353986) is no longer running 00:07:17.301 11:43:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:17.301 11:43:07 -- common/autotest_common.sh@850 -- # return 1 00:07:17.301 11:43:07 -- common/autotest_common.sh@641 -- # es=1 00:07:17.301 11:43:07 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:17.301 11:43:07 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:17.301 11:43:07 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:17.301 11:43:07 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:17.301 11:43:07 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:17.301 11:43:07 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:17.301 11:43:07 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:17.301 11:43:07 -- event/cpu_locks.sh@141 -- # killprocess 353961 00:07:17.301 11:43:07 -- common/autotest_common.sh@936 -- # '[' -z 353961 ']' 00:07:17.301 11:43:07 -- common/autotest_common.sh@940 -- # kill -0 353961 00:07:17.301 11:43:07 -- common/autotest_common.sh@941 -- # uname 00:07:17.301 11:43:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:17.301 11:43:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 353961 00:07:17.301 11:43:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:17.301 11:43:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:17.301 11:43:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 353961' 00:07:17.301 killing process with pid 353961 00:07:17.301 11:43:07 -- common/autotest_common.sh@955 -- # kill 353961 00:07:17.301 11:43:07 -- common/autotest_common.sh@960 -- # wait 353961 00:07:19.209 00:07:19.209 real 0m3.302s 00:07:19.209 user 0m8.565s 00:07:19.209 sys 0m0.726s 00:07:19.209 11:43:09 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:19.209 11:43:09 -- common/autotest_common.sh@10 -- # set +x 00:07:19.209 ************************************ 00:07:19.209 END TEST locking_overlapped_coremask 00:07:19.209 ************************************ 00:07:19.209 11:43:09 -- event/cpu_locks.sh@172 -- # run_test 
locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:19.209 11:43:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:19.209 11:43:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:19.209 11:43:09 -- common/autotest_common.sh@10 -- # set +x 00:07:19.209 ************************************ 00:07:19.209 START TEST locking_overlapped_coremask_via_rpc 00:07:19.209 ************************************ 00:07:19.209 11:43:09 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask_via_rpc 00:07:19.209 11:43:09 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=354369 00:07:19.209 11:43:09 -- event/cpu_locks.sh@149 -- # waitforlisten 354369 /var/tmp/spdk.sock 00:07:19.209 11:43:09 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:19.209 11:43:09 -- common/autotest_common.sh@817 -- # '[' -z 354369 ']' 00:07:19.209 11:43:09 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.209 11:43:09 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:19.209 11:43:09 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.209 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.209 11:43:09 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:19.209 11:43:09 -- common/autotest_common.sh@10 -- # set +x 00:07:19.209 [2024-04-18 11:43:09.555870] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:19.209 [2024-04-18 11:43:09.555969] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354369 ] 00:07:19.209 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.209 [2024-04-18 11:43:09.701266] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:19.209 [2024-04-18 11:43:09.701310] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:19.468 [2024-04-18 11:43:09.876093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.468 [2024-04-18 11:43:09.876153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.468 [2024-04-18 11:43:09.876163] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.037 11:43:10 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:20.037 11:43:10 -- common/autotest_common.sh@850 -- # return 0 00:07:20.037 11:43:10 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=354547 00:07:20.037 11:43:10 -- event/cpu_locks.sh@153 -- # waitforlisten 354547 /var/tmp/spdk2.sock 00:07:20.037 11:43:10 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:20.037 11:43:10 -- common/autotest_common.sh@817 -- # '[' -z 354547 ']' 00:07:20.037 11:43:10 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:20.037 11:43:10 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:20.037 11:43:10 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:20.037 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:20.037 11:43:10 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:20.037 11:43:10 -- common/autotest_common.sh@10 -- # set +x 00:07:20.037 [2024-04-18 11:43:10.547060] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:20.037 [2024-04-18 11:43:10.547160] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid354547 ] 00:07:20.297 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.297 [2024-04-18 11:43:10.747641] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:20.297 [2024-04-18 11:43:10.747692] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:20.866 [2024-04-18 11:43:11.109111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.866 [2024-04-18 11:43:11.109172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.866 [2024-04-18 11:43:11.109200] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:07:21.804 11:43:12 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:21.804 11:43:12 -- common/autotest_common.sh@850 -- # return 0 00:07:21.804 11:43:12 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:21.804 11:43:12 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:21.804 11:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:21.804 11:43:12 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:21.804 11:43:12 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:21.804 11:43:12 -- common/autotest_common.sh@638 -- # local es=0 00:07:21.804 11:43:12 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:21.804 11:43:12 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:07:21.804 11:43:12 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:21.804 11:43:12 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:07:21.804 11:43:12 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:21.804 11:43:12 -- common/autotest_common.sh@641 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:21.805 11:43:12 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:21.805 11:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:21.805 [2024-04-18 11:43:12.341520] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 354369 has claimed it. 
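(Context for the ERROR above: both targets in this test start with --disable-cpumask-locks and then try to claim the same cores at runtime via RPC. The first target's claim succeeds; the second target's attempt hits the already-held lock on core 2 and surfaces to the client as the JSON-RPC -32603 error dumped next. A sketch of the client side -- rpc_cmd in the trace is a thin wrapper, roughly equivalent to scripts/rpc.py pointed at each target's socket:)

    # first target (on /var/tmp/spdk.sock) claims its cores
    ./scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    # second target shares those cores; its claim must fail
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks \
        && echo "unexpected success" || echo "failed as expected (-32603)"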
00:07:21.805 request: 00:07:21.805 { 00:07:21.805 "method": "framework_enable_cpumask_locks", 00:07:21.805 "req_id": 1 00:07:21.805 } 00:07:21.805 Got JSON-RPC error response 00:07:21.805 response: 00:07:21.805 { 00:07:21.805 "code": -32603, 00:07:21.805 "message": "Failed to claim CPU core: 2" 00:07:21.805 } 00:07:21.805 11:43:12 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:07:21.805 11:43:12 -- common/autotest_common.sh@641 -- # es=1 00:07:21.805 11:43:12 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:21.805 11:43:12 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:21.805 11:43:12 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:21.805 11:43:12 -- event/cpu_locks.sh@158 -- # waitforlisten 354369 /var/tmp/spdk.sock 00:07:21.805 11:43:12 -- common/autotest_common.sh@817 -- # '[' -z 354369 ']' 00:07:21.805 11:43:12 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.805 11:43:12 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:21.805 11:43:12 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.805 11:43:12 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:21.805 11:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:22.064 11:43:12 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:22.064 11:43:12 -- common/autotest_common.sh@850 -- # return 0 00:07:22.064 11:43:12 -- event/cpu_locks.sh@159 -- # waitforlisten 354547 /var/tmp/spdk2.sock 00:07:22.064 11:43:12 -- common/autotest_common.sh@817 -- # '[' -z 354547 ']' 00:07:22.064 11:43:12 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:22.064 11:43:12 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:22.064 11:43:12 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:22.064 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:22.064 11:43:12 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:22.064 11:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:22.324 11:43:12 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:22.324 11:43:12 -- common/autotest_common.sh@850 -- # return 0 00:07:22.324 11:43:12 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:22.324 11:43:12 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:22.324 11:43:12 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:22.324 11:43:12 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:22.324 00:07:22.324 real 0m3.225s 00:07:22.324 user 0m0.853s 00:07:22.324 sys 0m0.229s 00:07:22.324 11:43:12 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:22.324 11:43:12 -- common/autotest_common.sh@10 -- # set +x 00:07:22.324 ************************************ 00:07:22.324 END TEST locking_overlapped_coremask_via_rpc 00:07:22.324 ************************************ 00:07:22.324 11:43:12 -- event/cpu_locks.sh@174 -- # cleanup 00:07:22.324 11:43:12 -- event/cpu_locks.sh@15 -- # [[ -z 354369 ]] 00:07:22.324 11:43:12 -- event/cpu_locks.sh@15 -- # killprocess 354369 00:07:22.324 11:43:12 -- common/autotest_common.sh@936 -- # '[' -z 354369 ']' 00:07:22.324 11:43:12 -- common/autotest_common.sh@940 -- # kill -0 354369 00:07:22.324 11:43:12 -- common/autotest_common.sh@941 -- # uname 00:07:22.324 11:43:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:22.324 11:43:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 354369 00:07:22.324 11:43:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:22.324 11:43:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:22.324 11:43:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 354369' 00:07:22.324 killing process with pid 354369 00:07:22.324 11:43:12 -- common/autotest_common.sh@955 -- # kill 354369 00:07:22.324 11:43:12 -- common/autotest_common.sh@960 -- # wait 354369 00:07:24.230 11:43:14 -- event/cpu_locks.sh@16 -- # [[ -z 354547 ]] 00:07:24.230 11:43:14 -- event/cpu_locks.sh@16 -- # killprocess 354547 00:07:24.230 11:43:14 -- common/autotest_common.sh@936 -- # '[' -z 354547 ']' 00:07:24.230 11:43:14 -- common/autotest_common.sh@940 -- # kill -0 354547 00:07:24.230 11:43:14 -- common/autotest_common.sh@941 -- # uname 00:07:24.230 11:43:14 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:24.230 11:43:14 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 354547 00:07:24.230 11:43:14 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:07:24.230 11:43:14 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:07:24.230 11:43:14 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 354547' 00:07:24.230 killing process with pid 354547 00:07:24.230 11:43:14 -- common/autotest_common.sh@955 -- # kill 354547 00:07:24.230 11:43:14 -- common/autotest_common.sh@960 -- # wait 354547 00:07:25.610 11:43:16 -- event/cpu_locks.sh@18 -- # rm -f 00:07:25.610 11:43:16 -- event/cpu_locks.sh@1 -- # cleanup 00:07:25.610 11:43:16 -- event/cpu_locks.sh@15 -- # [[ -z 354369 ]] 00:07:25.610 11:43:16 -- event/cpu_locks.sh@15 -- # killprocess 354369 00:07:25.610 
11:43:16 -- common/autotest_common.sh@936 -- # '[' -z 354369 ']' 00:07:25.610 11:43:16 -- common/autotest_common.sh@940 -- # kill -0 354369 00:07:25.610 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (354369) - No such process 00:07:25.610 11:43:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 354369 is not found' 00:07:25.610 Process with pid 354369 is not found 00:07:25.610 11:43:16 -- event/cpu_locks.sh@16 -- # [[ -z 354547 ]] 00:07:25.610 11:43:16 -- event/cpu_locks.sh@16 -- # killprocess 354547 00:07:25.610 11:43:16 -- common/autotest_common.sh@936 -- # '[' -z 354547 ']' 00:07:25.610 11:43:16 -- common/autotest_common.sh@940 -- # kill -0 354547 00:07:25.610 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (354547) - No such process 00:07:25.610 11:43:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 354547 is not found' 00:07:25.610 Process with pid 354547 is not found 00:07:25.610 11:43:16 -- event/cpu_locks.sh@18 -- # rm -f 00:07:25.610 00:07:25.610 real 0m38.086s 00:07:25.610 user 1m0.586s 00:07:25.610 sys 0m8.274s 00:07:25.610 11:43:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:25.610 11:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:25.610 ************************************ 00:07:25.610 END TEST cpu_locks 00:07:25.610 ************************************ 00:07:25.610 00:07:25.610 real 1m6.947s 00:07:25.610 user 1m52.188s 00:07:25.610 sys 0m13.508s 00:07:25.610 11:43:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:25.610 11:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:25.610 ************************************ 00:07:25.610 END TEST event 00:07:25.610 ************************************ 00:07:25.870 11:43:16 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:25.870 11:43:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:25.870 11:43:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:25.870 11:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:25.870 ************************************ 00:07:25.870 START TEST thread 00:07:25.870 ************************************ 00:07:25.870 11:43:16 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:25.870 * Looking for test storage... 00:07:26.160 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:26.160 11:43:16 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:26.160 11:43:16 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:26.160 11:43:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:26.160 11:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:26.160 ************************************ 00:07:26.160 START TEST thread_poller_perf 00:07:26.161 ************************************ 00:07:26.161 11:43:16 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:26.161 [2024-04-18 11:43:16.600710] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:26.161 [2024-04-18 11:43:16.600795] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid355382 ] 00:07:26.161 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.448 [2024-04-18 11:43:16.742761] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.448 [2024-04-18 11:43:16.912010] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.448 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:07:27.828 ====================================== 00:07:27.828 busy:2305901710 (cyc) 00:07:27.828 total_run_count: 773000 00:07:27.828 tsc_hz: 2300000000 (cyc) 00:07:27.828 ====================================== 00:07:27.828 poller_cost: 2983 (cyc), 1296 (nsec) 00:07:27.828 00:07:27.828 real 0m1.612s 00:07:27.828 user 0m1.440s 00:07:27.828 sys 0m0.164s 00:07:27.828 11:43:18 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:27.828 11:43:18 -- common/autotest_common.sh@10 -- # set +x 00:07:27.828 ************************************ 00:07:27.828 END TEST thread_poller_perf 00:07:27.828 ************************************ 00:07:27.828 11:43:18 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:27.828 11:43:18 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:27.828 11:43:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:27.828 11:43:18 -- common/autotest_common.sh@10 -- # set +x 00:07:27.828 ************************************ 00:07:27.828 START TEST thread_poller_perf 00:07:27.828 ************************************ 00:07:27.828 11:43:18 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:28.087 [2024-04-18 11:43:18.392810] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:28.087 [2024-04-18 11:43:18.392913] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid355746 ] 00:07:28.087 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.087 [2024-04-18 11:43:18.528755] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.347 [2024-04-18 11:43:18.695129] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.347 Running 1000 pollers for 1 seconds with 0 microseconds period. 
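(The ====== result blocks show how poller_cost is derived: busy TSC cycles divided by total_run_count gives cycles per poller invocation, converted to nanoseconds via tsc_hz. Checking the first table above with shell integer arithmetic -- the table that follows works the same way:)

    echo $(( 2305901710 / 773000 ))              # -> 2983 (cyc per call)
    echo $(( 2983 * 1000000000 / 2300000000 ))   # -> 1296 (nsec per call)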
00:07:29.726 ====================================== 00:07:29.726 busy:2302451774 (cyc) 00:07:29.726 total_run_count: 12626000 00:07:29.726 tsc_hz: 2300000000 (cyc) 00:07:29.726 ====================================== 00:07:29.726 poller_cost: 182 (cyc), 79 (nsec) 00:07:29.726 00:07:29.726 real 0m1.593s 00:07:29.726 user 0m1.429s 00:07:29.726 sys 0m0.158s 00:07:29.726 11:43:19 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:29.726 11:43:19 -- common/autotest_common.sh@10 -- # set +x 00:07:29.726 ************************************ 00:07:29.726 END TEST thread_poller_perf 00:07:29.726 ************************************ 00:07:29.726 11:43:19 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:29.726 11:43:19 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:29.726 11:43:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.726 11:43:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.726 11:43:19 -- common/autotest_common.sh@10 -- # set +x 00:07:29.726 ************************************ 00:07:29.726 START TEST thread_spdk_lock 00:07:29.726 ************************************ 00:07:29.726 11:43:20 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:29.726 [2024-04-18 11:43:20.157857] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:29.726 [2024-04-18 11:43:20.157951] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid355949 ] 00:07:29.726 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.984 [2024-04-18 11:43:20.298465] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:29.984 [2024-04-18 11:43:20.471786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.984 [2024-04-18 11:43:20.471797] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.551 [2024-04-18 11:43:20.965653] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.552 [2024-04-18 11:43:20.965712] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:30.552 [2024-04-18 11:43:20.965743] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x1bad760 00:07:30.552 [2024-04-18 11:43:20.971781] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.552 [2024-04-18 11:43:20.971881] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.552 [2024-04-18 11:43:20.971908] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.810 
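(The unrecoverable-spinlock *ERROR* lines above are intentional: the spdk_lock test drives spdk_spin_lock's failure detectors, covering deadlock detection and locks held while an SPDK thread is scheduled off a core. The verdict is the assertion counters printed at the end of the test, not these lines. As in the run_test invocation above, the binary takes no arguments:)

    # from the spdk checkout in this workspace; expect a summary ending in
    # "... assertions passed / 0 assertions failed"
    ./test/thread/lock/spdk_lock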
Starting test contend 00:07:30.810 Worker Delay Wait us Hold us Total us 00:07:30.810 0 3 160131 187450 347582 00:07:30.810 1 5 82702 286714 369417 00:07:30.810 PASS test contend 00:07:30.810 Starting test hold_by_poller 00:07:30.810 PASS test hold_by_poller 00:07:30.810 Starting test hold_by_message 00:07:30.810 PASS test hold_by_message 00:07:30.810 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:30.810 100014 assertions passed 00:07:30.810 0 assertions failed 00:07:30.810 00:07:30.810 real 0m1.106s 00:07:30.810 user 0m1.422s 00:07:30.810 sys 0m0.177s 00:07:30.810 11:43:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:30.810 11:43:21 -- common/autotest_common.sh@10 -- # set +x 00:07:30.810 ************************************ 00:07:30.810 END TEST thread_spdk_lock 00:07:30.810 ************************************ 00:07:30.810 00:07:30.810 real 0m4.953s 00:07:30.810 user 0m4.522s 00:07:30.810 sys 0m0.862s 00:07:30.810 11:43:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:30.810 11:43:21 -- common/autotest_common.sh@10 -- # set +x 00:07:30.810 ************************************ 00:07:30.810 END TEST thread 00:07:30.810 ************************************ 00:07:30.810 11:43:21 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:30.810 11:43:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:30.810 11:43:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.810 11:43:21 -- common/autotest_common.sh@10 -- # set +x 00:07:31.069 ************************************ 00:07:31.069 START TEST accel 00:07:31.069 ************************************ 00:07:31.069 11:43:21 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:31.069 * Looking for test storage... 00:07:31.069 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:31.069 11:43:21 -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:31.069 11:43:21 -- accel/accel.sh@82 -- # get_expected_opcs 00:07:31.069 11:43:21 -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:31.069 11:43:21 -- accel/accel.sh@62 -- # spdk_tgt_pid=356197 00:07:31.069 11:43:21 -- accel/accel.sh@63 -- # waitforlisten 356197 00:07:31.069 11:43:21 -- common/autotest_common.sh@817 -- # '[' -z 356197 ']' 00:07:31.069 11:43:21 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.069 11:43:21 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:31.069 11:43:21 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:31.069 11:43:21 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:31.069 11:43:21 -- common/autotest_common.sh@10 -- # set +x 00:07:31.069 11:43:21 -- accel/accel.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:31.069 11:43:21 -- accel/accel.sh@61 -- # build_accel_config 00:07:31.069 11:43:21 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.069 11:43:21 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.069 11:43:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.069 11:43:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.069 11:43:21 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.069 11:43:21 -- accel/accel.sh@40 -- # local IFS=, 00:07:31.069 11:43:21 -- accel/accel.sh@41 -- # jq -r . 00:07:31.069 [2024-04-18 11:43:21.609060] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:31.069 [2024-04-18 11:43:21.609144] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid356197 ] 00:07:31.327 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.327 [2024-04-18 11:43:21.752544] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.586 [2024-04-18 11:43:21.921168] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.153 11:43:22 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:32.153 11:43:22 -- common/autotest_common.sh@850 -- # return 0 00:07:32.153 11:43:22 -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:32.153 11:43:22 -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:32.153 11:43:22 -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:32.153 11:43:22 -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:32.153 11:43:22 -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:32.153 11:43:22 -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:32.153 11:43:22 -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:32.153 11:43:22 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:32.153 11:43:22 -- common/autotest_common.sh@10 -- # set +x 00:07:32.153 11:43:22 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:32.153 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.153 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.153 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.153 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.153 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.153 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.153 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.153 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.153 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.153 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 
11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # IFS== 00:07:32.154 11:43:22 -- accel/accel.sh@72 -- # read -r opc module 00:07:32.154 11:43:22 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:32.154 11:43:22 -- accel/accel.sh@75 -- # killprocess 356197 00:07:32.154 11:43:22 -- common/autotest_common.sh@936 -- # '[' -z 356197 ']' 00:07:32.154 11:43:22 -- common/autotest_common.sh@940 -- # kill -0 356197 00:07:32.154 11:43:22 -- common/autotest_common.sh@941 -- # uname 00:07:32.154 11:43:22 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:32.154 11:43:22 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 356197 00:07:32.154 11:43:22 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:32.154 11:43:22 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:32.154 11:43:22 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 356197' 00:07:32.154 killing process with pid 356197 00:07:32.154 11:43:22 -- common/autotest_common.sh@955 -- # kill 356197 00:07:32.154 11:43:22 -- common/autotest_common.sh@960 -- # wait 356197 00:07:34.058 11:43:24 -- accel/accel.sh@76 -- # trap - ERR 00:07:34.058 11:43:24 -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:34.058 11:43:24 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:07:34.058 11:43:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.058 11:43:24 -- common/autotest_common.sh@10 -- # set +x 00:07:34.058 11:43:24 -- common/autotest_common.sh@1111 -- # accel_perf -h 00:07:34.058 11:43:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:34.058 11:43:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.058 11:43:24 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.058 11:43:24 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.058 11:43:24 -- accel/accel.sh@40 -- # local IFS=, 00:07:34.058 11:43:24 -- accel/accel.sh@41 -- # jq -r . 
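Each of these runs drives the accel_perf example binary with a JSON accel configuration handed over on /dev/fd/62, which is the descriptor bash process substitution produces. A minimal stand-alone equivalent is sketched below, assuming an empty accel config is acceptable (mirroring the empty accel_json_cfg=() array in the trace); this is a reconstruction, not the exact harness code:

    # hedged sketch: feed an empty JSON config via process substitution,
    # the same mechanism that yields the "-c /dev/fd/62" seen in the trace
    ./build/examples/accel_perf -c <(echo '{}') -h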
00:07:34.058 11:43:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:34.058 11:43:24 -- common/autotest_common.sh@10 -- # set +x 00:07:34.058 11:43:24 -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:34.058 11:43:24 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:34.058 11:43:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:34.058 11:43:24 -- common/autotest_common.sh@10 -- # set +x 00:07:34.058 ************************************ 00:07:34.058 START TEST accel_missing_filename 00:07:34.058 ************************************ 00:07:34.058 11:43:24 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress 00:07:34.058 11:43:24 -- common/autotest_common.sh@638 -- # local es=0 00:07:34.058 11:43:24 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:34.058 11:43:24 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:07:34.058 11:43:24 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:34.058 11:43:24 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:07:34.058 11:43:24 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:34.058 11:43:24 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress 00:07:34.058 11:43:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:34.058 11:43:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.058 11:43:24 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.058 11:43:24 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.058 11:43:24 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.058 11:43:24 -- accel/accel.sh@40 -- # local IFS=, 00:07:34.058 11:43:24 -- accel/accel.sh@41 -- # jq -r . 00:07:34.058 [2024-04-18 11:43:24.583491] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:34.058 [2024-04-18 11:43:24.583580] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid356605 ] 00:07:34.317 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.317 [2024-04-18 11:43:24.734802] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.576 [2024-04-18 11:43:24.900935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.576 [2024-04-18 11:43:25.046871] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:35.145 [2024-04-18 11:43:25.391156] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:07:35.145 A filename is required. 
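This failure is the expected outcome: for compress workloads accel_perf requires an uncompressed input file via -l, and the NOT wrapper turns the non-zero exit into a pass. A run that should get past this check, reusing the input file the accel_compress_verify test below passes (paths are illustrative):

    # supply the input file that -w compress needs
    ./build/examples/accel_perf -t 1 -w compress -l ./test/accel/bib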
00:07:35.146 11:43:25 -- common/autotest_common.sh@641 -- # es=234 00:07:35.146 11:43:25 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:35.146 11:43:25 -- common/autotest_common.sh@650 -- # es=106 00:07:35.146 11:43:25 -- common/autotest_common.sh@651 -- # case "$es" in 00:07:35.146 11:43:25 -- common/autotest_common.sh@658 -- # es=1 00:07:35.146 11:43:25 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:35.146 00:07:35.146 real 0m1.114s 00:07:35.146 user 0m0.902s 00:07:35.146 sys 0m0.243s 00:07:35.146 11:43:25 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:35.146 11:43:25 -- common/autotest_common.sh@10 -- # set +x 00:07:35.146 ************************************ 00:07:35.146 END TEST accel_missing_filename 00:07:35.146 ************************************ 00:07:35.404 11:43:25 -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.404 11:43:25 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:07:35.404 11:43:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:35.404 11:43:25 -- common/autotest_common.sh@10 -- # set +x 00:07:35.404 ************************************ 00:07:35.404 START TEST accel_compress_verify 00:07:35.404 ************************************ 00:07:35.404 11:43:25 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.404 11:43:25 -- common/autotest_common.sh@638 -- # local es=0 00:07:35.404 11:43:25 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.404 11:43:25 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:07:35.404 11:43:25 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:35.404 11:43:25 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:07:35.404 11:43:25 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:35.404 11:43:25 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.404 11:43:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:35.404 11:43:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:35.404 11:43:25 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:35.404 11:43:25 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:35.404 11:43:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:35.404 11:43:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:35.404 11:43:25 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:35.404 11:43:25 -- accel/accel.sh@40 -- # local IFS=, 00:07:35.404 11:43:25 -- accel/accel.sh@41 -- # jq -r . 00:07:35.404 [2024-04-18 11:43:25.901066] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:35.404 [2024-04-18 11:43:25.901149] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid356805 ] 00:07:35.664 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.664 [2024-04-18 11:43:26.046000] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.924 [2024-04-18 11:43:26.220768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.924 [2024-04-18 11:43:26.371192] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:36.183 [2024-04-18 11:43:26.713330] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:07:36.443 00:07:36.443 Compression does not support the verify option, aborting. 00:07:36.443 11:43:26 -- common/autotest_common.sh@641 -- # es=161 00:07:36.443 11:43:26 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:36.443 11:43:26 -- common/autotest_common.sh@650 -- # es=33 00:07:36.443 11:43:26 -- common/autotest_common.sh@651 -- # case "$es" in 00:07:36.443 11:43:26 -- common/autotest_common.sh@658 -- # es=1 00:07:36.443 11:43:26 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:36.443 00:07:36.443 real 0m1.121s 00:07:36.443 user 0m0.909s 00:07:36.443 sys 0m0.248s 00:07:36.443 11:43:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:36.443 11:43:26 -- common/autotest_common.sh@10 -- # set +x 00:07:36.443 ************************************ 00:07:36.443 END TEST accel_compress_verify 00:07:36.443 ************************************ 00:07:36.703 11:43:27 -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:36.703 11:43:27 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:36.703 11:43:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:36.703 11:43:27 -- common/autotest_common.sh@10 -- # set +x 00:07:36.703 ************************************ 00:07:36.703 START TEST accel_wrong_workload 00:07:36.703 ************************************ 00:07:36.703 11:43:27 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w foobar 00:07:36.703 11:43:27 -- common/autotest_common.sh@638 -- # local es=0 00:07:36.703 11:43:27 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:36.703 11:43:27 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:07:36.703 11:43:27 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:36.703 11:43:27 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:07:36.703 11:43:27 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:36.703 11:43:27 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w foobar 00:07:36.703 11:43:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:36.703 11:43:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:36.703 11:43:27 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.703 11:43:27 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.703 11:43:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.703 11:43:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.703 11:43:27 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.703 11:43:27 -- accel/accel.sh@40 -- # local IFS=, 00:07:36.703 11:43:27 -- accel/accel.sh@41 -- # jq -r . 
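The next run deliberately passes an unsupported workload name so accel_perf rejects it during option parsing. The NOT helper from autotest_common.sh simply inverts the wrapped command's exit status; a minimal sketch of that behavior (an assumption-level reconstruction, not the exact upstream definition):

    # succeed only when the wrapped command fails
    NOT() { if "$@"; then return 1; else return 0; fi; }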
00:07:36.703 Unsupported workload type: foobar 00:07:36.703 [2024-04-18 11:43:27.204508] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:36.703 accel_perf options: 00:07:36.703 [-h help message] 00:07:36.703 [-q queue depth per core] 00:07:36.703 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:36.703 [-T number of threads per core 00:07:36.703 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:36.703 [-t time in seconds] 00:07:36.703 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:36.703 [ dif_verify, , dif_generate, dif_generate_copy 00:07:36.703 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:36.703 [-l for compress/decompress workloads, name of uncompressed input file 00:07:36.703 [-S for crc32c workload, use this seed value (default 0) 00:07:36.703 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:36.703 [-f for fill workload, use this BYTE value (default 255) 00:07:36.703 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:36.703 [-y verify result if this switch is on] 00:07:36.703 [-a tasks to allocate per core (default: same value as -q)] 00:07:36.703 Can be used to spread operations across a wider range of memory. 00:07:36.703 11:43:27 -- common/autotest_common.sh@641 -- # es=1 00:07:36.703 11:43:27 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:36.703 11:43:27 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:36.703 11:43:27 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:36.703 00:07:36.703 real 0m0.061s 00:07:36.703 user 0m0.065s 00:07:36.703 sys 0m0.036s 00:07:36.703 11:43:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:36.703 11:43:27 -- common/autotest_common.sh@10 -- # set +x 00:07:36.703 ************************************ 00:07:36.703 END TEST accel_wrong_workload 00:07:36.703 ************************************ 00:07:36.963 11:43:27 -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:36.963 11:43:27 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:07:36.963 11:43:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:36.963 11:43:27 -- common/autotest_common.sh@10 -- # set +x 00:07:36.963 ************************************ 00:07:36.963 START TEST accel_negative_buffers 00:07:36.963 ************************************ 00:07:36.963 11:43:27 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:36.963 11:43:27 -- common/autotest_common.sh@638 -- # local es=0 00:07:36.963 11:43:27 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:36.963 11:43:27 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:07:36.963 11:43:27 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:36.963 11:43:27 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:07:36.963 11:43:27 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:36.963 11:43:27 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w xor -y -x -1 00:07:36.963 11:43:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:36.963 11:43:27 -- accel/accel.sh@12 
-- # build_accel_config 00:07:36.963 11:43:27 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.963 11:43:27 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.963 11:43:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.963 11:43:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.963 11:43:27 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.963 11:43:27 -- accel/accel.sh@40 -- # local IFS=, 00:07:36.963 11:43:27 -- accel/accel.sh@41 -- # jq -r . 00:07:36.963 -x option must be non-negative. 00:07:36.963 [2024-04-18 11:43:27.453719] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:36.963 accel_perf options: 00:07:36.963 [-h help message] 00:07:36.963 [-q queue depth per core] 00:07:36.963 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:36.963 [-T number of threads per core 00:07:36.963 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:36.963 [-t time in seconds] 00:07:36.963 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:36.963 [ dif_verify, , dif_generate, dif_generate_copy 00:07:36.963 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:36.963 [-l for compress/decompress workloads, name of uncompressed input file 00:07:36.963 [-S for crc32c workload, use this seed value (default 0) 00:07:36.963 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:36.963 [-f for fill workload, use this BYTE value (default 255) 00:07:36.963 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:36.963 [-y verify result if this switch is on] 00:07:36.963 [-a tasks to allocate per core (default: same value as -q)] 00:07:36.963 Can be used to spread operations across a wider range of memory. 
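The option summary above is enough to assemble a typical run. An illustrative invocation built only from flags listed there (the values are arbitrary and not taken from this job):

    # 64 requests per core, 4 KiB transfers, 1 s of xor across 2 source buffers, verify on
    ./build/examples/accel_perf -q 64 -o 4096 -t 1 -w xor -x 2 -y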
00:07:36.963 11:43:27 -- common/autotest_common.sh@641 -- # es=1 00:07:36.963 11:43:27 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:36.963 11:43:27 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:36.963 11:43:27 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:36.963 00:07:36.963 real 0m0.053s 00:07:36.963 user 0m0.057s 00:07:36.963 sys 0m0.035s 00:07:36.963 11:43:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:36.963 11:43:27 -- common/autotest_common.sh@10 -- # set +x 00:07:36.963 ************************************ 00:07:36.963 END TEST accel_negative_buffers 00:07:36.963 ************************************ 00:07:36.963 11:43:27 -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:36.963 11:43:27 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:36.963 11:43:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:36.963 11:43:27 -- common/autotest_common.sh@10 -- # set +x 00:07:37.223 ************************************ 00:07:37.223 START TEST accel_crc32c 00:07:37.223 ************************************ 00:07:37.223 11:43:27 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:37.223 11:43:27 -- accel/accel.sh@16 -- # local accel_opc 00:07:37.223 11:43:27 -- accel/accel.sh@17 -- # local accel_module 00:07:37.223 11:43:27 -- accel/accel.sh@19 -- # IFS=: 00:07:37.223 11:43:27 -- accel/accel.sh@19 -- # read -r var val 00:07:37.223 11:43:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:37.223 11:43:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:37.223 11:43:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:37.223 11:43:27 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:37.223 11:43:27 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:37.223 11:43:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:37.223 11:43:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:37.223 11:43:27 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:37.223 11:43:27 -- accel/accel.sh@40 -- # local IFS=, 00:07:37.223 11:43:27 -- accel/accel.sh@41 -- # jq -r . 00:07:37.223 [2024-04-18 11:43:27.695483] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:37.223 [2024-04-18 11:43:27.695576] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid357175 ] 00:07:37.223 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.483 [2024-04-18 11:43:27.836633] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.483 [2024-04-18 11:43:28.004501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=0x1 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=crc32c 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=32 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=software 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@22 -- # accel_module=software 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=32 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=32 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- 
accel/accel.sh@20 -- # val=1 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val=Yes 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:37.743 11:43:28 -- accel/accel.sh@20 -- # val= 00:07:37.743 11:43:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # IFS=: 00:07:37.743 11:43:28 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@20 -- # val= 00:07:39.651 11:43:29 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:39.651 11:43:29 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:39.651 11:43:29 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.651 00:07:39.651 real 0m2.103s 00:07:39.651 user 0m1.864s 00:07:39.651 sys 0m0.236s 00:07:39.651 11:43:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:39.651 11:43:29 -- common/autotest_common.sh@10 -- # set +x 00:07:39.651 ************************************ 00:07:39.651 END TEST accel_crc32c 00:07:39.651 ************************************ 00:07:39.651 11:43:29 -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:39.651 11:43:29 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:39.651 11:43:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:39.651 11:43:29 -- common/autotest_common.sh@10 -- # set +x 00:07:39.651 ************************************ 00:07:39.651 START TEST 
accel_crc32c_C2 00:07:39.651 ************************************ 00:07:39.651 11:43:29 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:39.651 11:43:29 -- accel/accel.sh@16 -- # local accel_opc 00:07:39.651 11:43:29 -- accel/accel.sh@17 -- # local accel_module 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # IFS=: 00:07:39.651 11:43:29 -- accel/accel.sh@19 -- # read -r var val 00:07:39.651 11:43:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:39.651 11:43:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:39.651 11:43:29 -- accel/accel.sh@12 -- # build_accel_config 00:07:39.651 11:43:29 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.651 11:43:29 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.651 11:43:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.651 11:43:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.651 11:43:29 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.651 11:43:29 -- accel/accel.sh@40 -- # local IFS=, 00:07:39.651 11:43:29 -- accel/accel.sh@41 -- # jq -r . 00:07:39.651 [2024-04-18 11:43:29.964916] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:39.651 [2024-04-18 11:43:29.965000] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid357444 ] 00:07:39.651 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.651 [2024-04-18 11:43:30.107748] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.911 [2024-04-18 11:43:30.279802] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=0x1 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=crc32c 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=0 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 
-- accel/accel.sh@20 -- # val='4096 bytes' 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=software 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@22 -- # accel_module=software 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=32 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=32 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=1 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val=Yes 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:39.911 11:43:30 -- accel/accel.sh@20 -- # val= 00:07:39.911 11:43:30 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # IFS=: 00:07:39.911 11:43:30 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 
-- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:41.818 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:41.818 11:43:32 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:41.818 11:43:32 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:41.818 00:07:41.818 real 0m2.112s 00:07:41.818 user 0m1.875s 00:07:41.818 sys 0m0.236s 00:07:41.818 11:43:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:41.818 11:43:32 -- common/autotest_common.sh@10 -- # set +x 00:07:41.818 ************************************ 00:07:41.818 END TEST accel_crc32c_C2 00:07:41.818 ************************************ 00:07:41.818 11:43:32 -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:41.818 11:43:32 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:41.818 11:43:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:41.818 11:43:32 -- common/autotest_common.sh@10 -- # set +x 00:07:41.818 ************************************ 00:07:41.818 START TEST accel_copy 00:07:41.818 ************************************ 00:07:41.818 11:43:32 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy -y 00:07:41.818 11:43:32 -- accel/accel.sh@16 -- # local accel_opc 00:07:41.818 11:43:32 -- accel/accel.sh@17 -- # local accel_module 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:41.818 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:41.818 11:43:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:41.818 11:43:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:41.818 11:43:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.818 11:43:32 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.818 11:43:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.818 11:43:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.818 11:43:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.818 11:43:32 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.818 11:43:32 -- accel/accel.sh@40 -- # local IFS=, 00:07:41.818 11:43:32 -- accel/accel.sh@41 -- # jq -r . 00:07:41.818 [2024-04-18 11:43:32.270848] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:41.818 [2024-04-18 11:43:32.270928] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid357813 ] 00:07:41.818 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.078 [2024-04-18 11:43:32.409040] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.078 [2024-04-18 11:43:32.574623] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.337 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.337 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.337 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.337 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.337 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.337 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.337 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.337 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.337 11:43:32 -- accel/accel.sh@20 -- # val=0x1 00:07:42.337 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.337 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=copy 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@23 -- # accel_opc=copy 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=software 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@22 -- # accel_module=software 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=32 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=32 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=1 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val=Yes 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:42.338 11:43:32 -- accel/accel.sh@20 -- # val= 00:07:42.338 11:43:32 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # IFS=: 00:07:42.338 11:43:32 -- accel/accel.sh@19 -- # read -r var val 00:07:44.244 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.245 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.245 11:43:34 -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:44.245 11:43:34 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.245 00:07:44.245 real 0m2.091s 00:07:44.245 user 0m1.856s 00:07:44.245 sys 0m0.232s 00:07:44.245 11:43:34 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:44.245 11:43:34 -- common/autotest_common.sh@10 -- # set +x 00:07:44.245 ************************************ 00:07:44.245 END TEST accel_copy 00:07:44.245 ************************************ 00:07:44.245 11:43:34 -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.245 11:43:34 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:44.245 11:43:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:44.245 11:43:34 -- common/autotest_common.sh@10 -- # set +x 00:07:44.245 ************************************ 00:07:44.245 START TEST accel_fill 00:07:44.245 ************************************ 00:07:44.245 11:43:34 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.245 11:43:34 -- accel/accel.sh@16 -- # local accel_opc 
00:07:44.245 11:43:34 -- accel/accel.sh@17 -- # local accel_module 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.245 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.245 11:43:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.245 11:43:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:44.245 11:43:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:44.245 11:43:34 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.245 11:43:34 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.245 11:43:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.245 11:43:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.245 11:43:34 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.245 11:43:34 -- accel/accel.sh@40 -- # local IFS=, 00:07:44.245 11:43:34 -- accel/accel.sh@41 -- # jq -r . 00:07:44.245 [2024-04-18 11:43:34.547279] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:44.245 [2024-04-18 11:43:34.547351] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358144 ] 00:07:44.245 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.245 [2024-04-18 11:43:34.686615] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.504 [2024-04-18 11:43:34.854321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val=0x1 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val=fill 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@23 -- # accel_opc=fill 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val=0x80 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.504 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.504 11:43:34 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.504 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:34 -- accel/accel.sh@19 
-- # read -r var val 00:07:44.505 11:43:34 -- accel/accel.sh@20 -- # val= 00:07:44.505 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:34 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:34 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:34 -- accel/accel.sh@20 -- # val=software 00:07:44.505 11:43:34 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:34 -- accel/accel.sh@22 -- # accel_module=software 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val=64 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val=64 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val=1 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val=Yes 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val= 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:44.505 11:43:35 -- accel/accel.sh@20 -- # val= 00:07:44.505 11:43:35 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # IFS=: 00:07:44.505 11:43:35 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@20 -- # val= 00:07:46.410 11:43:36 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.410 11:43:36 -- accel/accel.sh@19 
-- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:46.410 11:43:36 -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:46.410 11:43:36 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.410 00:07:46.410 real 0m2.086s 00:07:46.410 user 0m1.863s 00:07:46.410 sys 0m0.224s 00:07:46.410 11:43:36 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:46.410 11:43:36 -- common/autotest_common.sh@10 -- # set +x 00:07:46.410 ************************************ 00:07:46.410 END TEST accel_fill 00:07:46.410 ************************************ 00:07:46.410 11:43:36 -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:46.410 11:43:36 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:46.410 11:43:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:46.410 11:43:36 -- common/autotest_common.sh@10 -- # set +x 00:07:46.410 ************************************ 00:07:46.410 START TEST accel_copy_crc32c 00:07:46.410 ************************************ 00:07:46.410 11:43:36 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y 00:07:46.410 11:43:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:46.410 11:43:36 -- accel/accel.sh@17 -- # local accel_module 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # IFS=: 00:07:46.410 11:43:36 -- accel/accel.sh@19 -- # read -r var val 00:07:46.410 11:43:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:46.410 11:43:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:46.410 11:43:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:46.410 11:43:36 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.410 11:43:36 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.410 11:43:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.410 11:43:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.410 11:43:36 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.410 11:43:36 -- accel/accel.sh@40 -- # local IFS=, 00:07:46.410 11:43:36 -- accel/accel.sh@41 -- # jq -r . 00:07:46.410 [2024-04-18 11:43:36.807194] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:46.410 [2024-04-18 11:43:36.807272] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358401 ] 00:07:46.410 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.410 [2024-04-18 11:43:36.951122] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.671 [2024-04-18 11:43:37.121016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val=0x1 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val=0 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.930 11:43:37 -- accel/accel.sh@20 -- # val=software 00:07:46.930 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.930 11:43:37 -- accel/accel.sh@22 -- # accel_module=software 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.930 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val=32 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 
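Note that copy_crc32c configures two 4096-byte buffers in the trace above, one source and one destination, whereas the plain crc32c run earlier needed only a single input buffer.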
00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val=32 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val=1 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val=Yes 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:46.931 11:43:37 -- accel/accel.sh@20 -- # val= 00:07:46.931 11:43:37 -- accel/accel.sh@21 -- # case "$var" in 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # IFS=: 00:07:46.931 11:43:37 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@20 -- # val= 00:07:48.837 11:43:38 -- accel/accel.sh@21 -- # case "$var" in 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:38 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:38 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:48.837 11:43:38 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:48.837 11:43:38 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:48.837 00:07:48.837 real 0m2.131s 00:07:48.837 user 0m1.887s 00:07:48.837 sys 0m0.245s 00:07:48.837 11:43:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:48.837 11:43:38 -- common/autotest_common.sh@10 -- # set +x 00:07:48.837 ************************************ 00:07:48.837 END TEST accel_copy_crc32c 00:07:48.837 ************************************ 00:07:48.837 11:43:38 -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:48.837 
11:43:38 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:48.837 11:43:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:48.837 11:43:38 -- common/autotest_common.sh@10 -- # set +x 00:07:48.837 ************************************ 00:07:48.837 START TEST accel_copy_crc32c_C2 00:07:48.837 ************************************ 00:07:48.837 11:43:39 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:48.837 11:43:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:48.837 11:43:39 -- accel/accel.sh@17 -- # local accel_module 00:07:48.837 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:48.837 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:48.837 11:43:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:48.837 11:43:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:48.837 11:43:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:48.837 11:43:39 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:48.837 11:43:39 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:48.837 11:43:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:48.837 11:43:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:48.837 11:43:39 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:48.837 11:43:39 -- accel/accel.sh@40 -- # local IFS=, 00:07:48.837 11:43:39 -- accel/accel.sh@41 -- # jq -r . 00:07:48.837 [2024-04-18 11:43:39.125817] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:48.837 [2024-04-18 11:43:39.125886] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid358772 ] 00:07:48.837 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.837 [2024-04-18 11:43:39.264294] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.096 [2024-04-18 11:43:39.432629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val=0x1 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 
11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val=0 00:07:49.096 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.096 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.096 11:43:39 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val=software 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@22 -- # accel_module=software 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val=32 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val=32 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val=1 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val=Yes 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:49.097 11:43:39 -- accel/accel.sh@20 -- # val= 00:07:49.097 11:43:39 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # IFS=: 00:07:49.097 11:43:39 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.004 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:51.004 11:43:41 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:51.004 11:43:41 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:51.004 00:07:51.004 real 0m2.117s 00:07:51.004 user 0m1.871s 00:07:51.004 sys 0m0.246s 00:07:51.004 11:43:41 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:51.004 11:43:41 -- common/autotest_common.sh@10 -- # set +x 00:07:51.004 ************************************ 00:07:51.004 END TEST accel_copy_crc32c_C2 00:07:51.004 ************************************ 00:07:51.004 11:43:41 -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:51.004 11:43:41 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:51.004 11:43:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:51.004 11:43:41 -- common/autotest_common.sh@10 -- # set +x 00:07:51.004 ************************************ 00:07:51.004 START TEST accel_dualcast 00:07:51.004 ************************************ 00:07:51.004 11:43:41 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dualcast -y 00:07:51.004 11:43:41 -- accel/accel.sh@16 -- # local accel_opc 00:07:51.004 11:43:41 -- accel/accel.sh@17 -- # local accel_module 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.004 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.004 11:43:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:51.004 11:43:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:51.004 11:43:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:51.004 11:43:41 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:51.004 11:43:41 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:51.004 11:43:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:51.004 11:43:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:51.004 11:43:41 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:51.004 11:43:41 -- accel/accel.sh@40 -- # local IFS=, 00:07:51.004 11:43:41 -- accel/accel.sh@41 -- # jq -r . 00:07:51.004 [2024-04-18 11:43:41.435108] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:07:51.004 [2024-04-18 11:43:41.435184] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359144 ] 00:07:51.004 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.263 [2024-04-18 11:43:41.576605] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.263 [2024-04-18 11:43:41.744378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=0x1 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=dualcast 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=software 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@22 -- # accel_module=software 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=32 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=32 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=1 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val=Yes 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:51.522 11:43:41 -- accel/accel.sh@20 -- # val= 00:07:51.522 11:43:41 -- accel/accel.sh@21 -- # case "$var" in 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # IFS=: 00:07:51.522 11:43:41 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@20 -- # val= 00:07:53.428 11:43:43 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:53.428 11:43:43 -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:53.428 11:43:43 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:53.428 00:07:53.428 real 0m2.103s 00:07:53.428 user 0m1.872s 00:07:53.428 sys 0m0.232s 00:07:53.428 11:43:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:53.428 11:43:43 -- common/autotest_common.sh@10 -- # set +x 00:07:53.428 ************************************ 00:07:53.428 END TEST accel_dualcast 00:07:53.428 ************************************ 00:07:53.428 11:43:43 -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:53.428 11:43:43 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:53.428 11:43:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:53.428 11:43:43 -- common/autotest_common.sh@10 -- # set +x 00:07:53.428 ************************************ 00:07:53.428 START TEST accel_compare 00:07:53.428 ************************************ 00:07:53.428 11:43:43 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compare -y 00:07:53.428 11:43:43 -- accel/accel.sh@16 -- # local accel_opc 00:07:53.428 11:43:43 -- 
accel/accel.sh@17 -- # local accel_module 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # IFS=: 00:07:53.428 11:43:43 -- accel/accel.sh@19 -- # read -r var val 00:07:53.428 11:43:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:53.428 11:43:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:53.428 11:43:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:53.428 11:43:43 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:53.428 11:43:43 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:53.428 11:43:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:53.428 11:43:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:53.428 11:43:43 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:53.428 11:43:43 -- accel/accel.sh@40 -- # local IFS=, 00:07:53.428 11:43:43 -- accel/accel.sh@41 -- # jq -r . 00:07:53.428 [2024-04-18 11:43:43.718002] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:53.428 [2024-04-18 11:43:43.718077] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359365 ] 00:07:53.428 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.428 [2024-04-18 11:43:43.857059] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.688 [2024-04-18 11:43:44.026391] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=0x1 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=compare 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@23 -- # accel_opc=compare 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- 
accel/accel.sh@20 -- # val=software 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@22 -- # accel_module=software 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=32 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=32 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=1 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val=Yes 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:53.688 11:43:44 -- accel/accel.sh@20 -- # val= 00:07:53.688 11:43:44 -- accel/accel.sh@21 -- # case "$var" in 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # IFS=: 00:07:53.688 11:43:44 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@20 -- # val= 00:07:55.666 11:43:45 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.666 11:43:45 -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:55.666 11:43:45 -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:07:55.666 00:07:55.666 real 0m2.106s 00:07:55.666 user 0m1.879s 00:07:55.666 sys 0m0.225s 00:07:55.666 11:43:45 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:55.666 11:43:45 -- common/autotest_common.sh@10 -- # set +x 00:07:55.666 ************************************ 00:07:55.666 END TEST accel_compare 00:07:55.666 ************************************ 00:07:55.666 11:43:45 -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:55.666 11:43:45 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:07:55.666 11:43:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:55.666 11:43:45 -- common/autotest_common.sh@10 -- # set +x 00:07:55.666 ************************************ 00:07:55.666 START TEST accel_xor 00:07:55.666 ************************************ 00:07:55.666 11:43:45 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y 00:07:55.666 11:43:45 -- accel/accel.sh@16 -- # local accel_opc 00:07:55.666 11:43:45 -- accel/accel.sh@17 -- # local accel_module 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # IFS=: 00:07:55.666 11:43:45 -- accel/accel.sh@19 -- # read -r var val 00:07:55.666 11:43:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:55.666 11:43:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:55.666 11:43:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:55.666 11:43:45 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.666 11:43:45 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.666 11:43:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.666 11:43:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.666 11:43:45 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.666 11:43:45 -- accel/accel.sh@40 -- # local IFS=, 00:07:55.666 11:43:45 -- accel/accel.sh@41 -- # jq -r . 00:07:55.666 [2024-04-18 11:43:46.015958] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
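Two xor passes run back to back: the pass that has just started uses the default of two source buffers (val=2 in the trace below), and the next run_test repeats the workload with -x 3 to exercise a three-source xor (val=3). Both invocations appear verbatim in the log:

    build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y         # 2 sources (default)
    build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3    # 3 sources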
00:07:55.666 [2024-04-18 11:43:46.016027] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid359733 ] 00:07:55.666 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.666 [2024-04-18 11:43:46.155765] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.926 [2024-04-18 11:43:46.324287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=0x1 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=xor 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=2 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=software 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@22 -- # accel_module=software 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=32 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=32 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- 
accel/accel.sh@20 -- # val=1 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val=Yes 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:55.926 11:43:46 -- accel/accel.sh@20 -- # val= 00:07:55.926 11:43:46 -- accel/accel.sh@21 -- # case "$var" in 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # IFS=: 00:07:55.926 11:43:46 -- accel/accel.sh@19 -- # read -r var val 00:07:57.834 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.834 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.834 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.834 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.834 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.834 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.834 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.834 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.835 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.835 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.835 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:57.835 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.835 11:43:48 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:57.835 11:43:48 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.835 00:07:57.835 real 0m2.098s 00:07:57.835 user 0m1.875s 00:07:57.835 sys 0m0.219s 00:07:57.835 11:43:48 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:57.835 11:43:48 -- common/autotest_common.sh@10 -- # set +x 00:07:57.835 ************************************ 00:07:57.835 END TEST accel_xor 00:07:57.835 ************************************ 00:07:57.835 11:43:48 -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:57.835 11:43:48 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:57.835 11:43:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:57.835 11:43:48 -- common/autotest_common.sh@10 -- # set +x 00:07:57.835 ************************************ 00:07:57.835 START TEST accel_xor 
00:07:57.835 ************************************ 00:07:57.835 11:43:48 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y -x 3 00:07:57.835 11:43:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:57.835 11:43:48 -- accel/accel.sh@17 -- # local accel_module 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:57.835 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:57.835 11:43:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:57.835 11:43:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:57.835 11:43:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:57.835 11:43:48 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.835 11:43:48 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.835 11:43:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.835 11:43:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:57.835 11:43:48 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.835 11:43:48 -- accel/accel.sh@40 -- # local IFS=, 00:07:57.835 11:43:48 -- accel/accel.sh@41 -- # jq -r . 00:07:57.835 [2024-04-18 11:43:48.296678] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:07:57.835 [2024-04-18 11:43:48.296751] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360107 ] 00:07:57.835 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.094 [2024-04-18 11:43:48.435088] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.094 [2024-04-18 11:43:48.603256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val=0x1 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val=xor 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val=3 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.353 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.353 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.353 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val=software 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@22 -- # accel_module=software 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val=32 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val=32 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val=1 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val=Yes 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:07:58.354 11:43:48 -- accel/accel.sh@20 -- # val= 00:07:58.354 11:43:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # IFS=: 00:07:58.354 11:43:48 -- accel/accel.sh@19 -- # read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.261 11:43:50 -- accel/accel.sh@19 -- # 
read -r var val 00:08:00.261 11:43:50 -- accel/accel.sh@20 -- # val= 00:08:00.261 11:43:50 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.262 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.262 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.262 11:43:50 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.262 11:43:50 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:00.262 11:43:50 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.262 00:08:00.262 real 0m2.114s 00:08:00.262 user 0m1.880s 00:08:00.262 sys 0m0.234s 00:08:00.262 11:43:50 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:00.262 11:43:50 -- common/autotest_common.sh@10 -- # set +x 00:08:00.262 ************************************ 00:08:00.262 END TEST accel_xor 00:08:00.262 ************************************ 00:08:00.262 11:43:50 -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:00.262 11:43:50 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:08:00.262 11:43:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:00.262 11:43:50 -- common/autotest_common.sh@10 -- # set +x 00:08:00.262 ************************************ 00:08:00.262 START TEST accel_dif_verify 00:08:00.262 ************************************ 00:08:00.262 11:43:50 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_verify 00:08:00.262 11:43:50 -- accel/accel.sh@16 -- # local accel_opc 00:08:00.262 11:43:50 -- accel/accel.sh@17 -- # local accel_module 00:08:00.262 11:43:50 -- accel/accel.sh@19 -- # IFS=: 00:08:00.262 11:43:50 -- accel/accel.sh@19 -- # read -r var val 00:08:00.262 11:43:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:00.262 11:43:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:00.262 11:43:50 -- accel/accel.sh@12 -- # build_accel_config 00:08:00.262 11:43:50 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.262 11:43:50 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.262 11:43:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.262 11:43:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.262 11:43:50 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.262 11:43:50 -- accel/accel.sh@40 -- # local IFS=, 00:08:00.262 11:43:50 -- accel/accel.sh@41 -- # jq -r . 00:08:00.262 [2024-04-18 11:43:50.576578] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
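The dif_verify trace below records the protection-information geometry alongside the usual settings: 4096-byte transfer buffers, a 512-byte block size, and 8 bytes of DIF metadata per block (the '4096 bytes' / '512 bytes' / '8 bytes' val= lines). Reading that as each 4 KiB buffer being verified as eight 512+8 protected blocks follows the usual T10 DIF layout; the log itself only shows the raw sizes.

    build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify     # as captured above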
00:08:00.262 [2024-04-18 11:43:50.576667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360319 ] 00:08:00.262 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.262 [2024-04-18 11:43:50.721141] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.521 [2024-04-18 11:43:50.899372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=0x1 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=dif_verify 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val='512 bytes' 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val='8 bytes' 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=software 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@22 -- # accel_module=software 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r 
var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=32 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=32 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.521 11:43:51 -- accel/accel.sh@20 -- # val=1 00:08:00.521 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.521 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.522 11:43:51 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.522 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.522 11:43:51 -- accel/accel.sh@20 -- # val=No 00:08:00.522 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.522 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.522 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:00.522 11:43:51 -- accel/accel.sh@20 -- # val= 00:08:00.522 11:43:51 -- accel/accel.sh@21 -- # case "$var" in 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # IFS=: 00:08:00.522 11:43:51 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@20 -- # val= 00:08:02.428 11:43:52 -- accel/accel.sh@21 -- # case "$var" in 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:02.428 11:43:52 -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:02.428 11:43:52 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:02.428 00:08:02.428 real 0m2.124s 00:08:02.428 user 0m1.889s 00:08:02.428 sys 0m0.236s 00:08:02.428 11:43:52 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:02.428 11:43:52 -- common/autotest_common.sh@10 -- # set +x 00:08:02.428 
************************************ 00:08:02.428 END TEST accel_dif_verify 00:08:02.428 ************************************ 00:08:02.428 11:43:52 -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:02.428 11:43:52 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:08:02.428 11:43:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:02.428 11:43:52 -- common/autotest_common.sh@10 -- # set +x 00:08:02.428 ************************************ 00:08:02.428 START TEST accel_dif_generate 00:08:02.428 ************************************ 00:08:02.428 11:43:52 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate 00:08:02.428 11:43:52 -- accel/accel.sh@16 -- # local accel_opc 00:08:02.428 11:43:52 -- accel/accel.sh@17 -- # local accel_module 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # IFS=: 00:08:02.428 11:43:52 -- accel/accel.sh@19 -- # read -r var val 00:08:02.428 11:43:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:02.428 11:43:52 -- accel/accel.sh@12 -- # build_accel_config 00:08:02.428 11:43:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:02.428 11:43:52 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.428 11:43:52 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.428 11:43:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.428 11:43:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.428 11:43:52 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.428 11:43:52 -- accel/accel.sh@40 -- # local IFS=, 00:08:02.428 11:43:52 -- accel/accel.sh@41 -- # jq -r . 00:08:02.428 [2024-04-18 11:43:52.878571] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
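dif_generate is the producer-side counterpart to the dif_verify test that just passed: where dif_verify checks protection information already present in the buffer, dif_generate computes and writes the 8-byte DIF fields for each 512-byte block (the same 4096/512/8 geometry appears in its trace). dif_generate_copy, tested next, does the same while also copying the data to a destination buffer.

    build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate   # as captured above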
00:08:02.429 [2024-04-18 11:43:52.878649] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid360691 ]
00:08:02.429 EAL: No free 2048 kB hugepages reported on node 1
00:08:02.688 [2024-04-18 11:43:53.020397] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:02.688 [2024-04-18 11:43:53.188784] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:02.946 11:43:53 -- accel/accel.sh@20 -- # [xtrace value loop, per-value 'case "$var" in' / 'IFS=:' / 'read -r var val' boilerplate elided] val=0x1, val=dif_generate (accel_opc=dif_generate), val='4096 bytes', val='4096 bytes', val='512 bytes', val='8 bytes', val=software (accel_module=software), val=32, val=32, val=1, val='1 seconds', val=No, plus empty val= reads between sections
00:08:04.853 11:43:54 -- accel/accel.sh@20 -- # [post-run xtrace, elided] six empty val= reads
00:08:04.853 11:43:54 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:04.853 11:43:54 -- accel/accel.sh@27 -- # [[ -n dif_generate ]]
00:08:04.853 11:43:54 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:04.853
00:08:04.853 real 0m2.113s
00:08:04.853 user 0m1.880s
00:08:04.853 sys 0m0.235s
00:08:04.853 11:43:54 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:04.853 11:43:54 -- common/autotest_common.sh@10 -- # set +x
00:08:04.853 ************************************
00:08:04.853 END TEST accel_dif_generate
00:08:04.853 ************************************
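The case/IFS/read churn condensed above is accel.sh walking the expected-workload values one "var: val" pair at a time. A minimal sketch of a loop with that shape, assuming hypothetical key names and an input file (the real script feeds it from jq -r . over the accel_perf config; this is not its source):

    #!/usr/bin/env bash
    # Minimal sketch of the loop shape the xtrace shows; the key names and the
    # input file below are assumptions, not the actual accel.sh source.
    while IFS=: read -r var val; do        # yields the 'IFS=:' / 'read -r var val' trace lines
      case "$var" in                       # yields the 'case "$var" in' trace lines
        *opc*) accel_opc=${val# } ;;       # e.g. dif_generate, compress, decompress
        *module*) accel_module=${val# } ;; # e.g. software
        *) ;;                              # sizes, queue depth, run time pass through
      esac
    done < expected_values.txt             # assumed stand-in for the harness's jq output
    echo "opc=$accel_opc module=$accel_module"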
00:08:04.853 11:43:54 -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy
00:08:04.853 11:43:54 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:08:04.853 11:43:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:04.853 11:43:54 -- common/autotest_common.sh@10 -- # set +x
00:08:04.853 ************************************
00:08:04.853 START TEST accel_dif_generate_copy
00:08:04.853 ************************************
00:08:04.853 11:43:55 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate_copy
00:08:04.853 11:43:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy
00:08:04.853 11:43:55 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided: accel_json_cfg=(), no modules enabled, local IFS=',', jq -r .]
00:08:04.853 [2024-04-18 11:43:55.168972] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:04.853 [2024-04-18 11:43:55.169054] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361067 ]
00:08:04.853 EAL: No free 2048 kB hugepages reported on node 1
00:08:05.113 [2024-04-18 11:43:55.312301] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:05.113 [2024-04-18 11:43:55.477618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:05.113 11:43:55 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0x1, val=dif_generate_copy (accel_opc=dif_generate_copy), val='4096 bytes', val='4096 bytes', val=software (accel_module=software), val=32, val=32, val=1, val='1 seconds', val=No, plus empty val= reads between sections
00:08:07.022 11:43:57 -- accel/accel.sh@20 -- # [post-run xtrace, elided] six empty val= reads
00:08:07.022 11:43:57 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:07.022 11:43:57 -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]]
00:08:07.022 11:43:57 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:07.022
00:08:07.022 real 0m2.096s
00:08:07.022 user 0m1.868s
00:08:07.022 sys 0m0.229s
00:08:07.022 11:43:57 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:07.022 11:43:57 -- common/autotest_common.sh@10 -- # set +x
00:08:07.022 ************************************
00:08:07.022 END TEST accel_dif_generate_copy
00:08:07.022 ************************************
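The perf command being traced is printed in full above and can be replayed by hand. A hedged sketch, assuming -t is the run time in seconds and -w the workload name, as the trace suggests; accel.json stands in for the JSON config the harness pipes in on fd 62:

    # Replaying the traced invocation outside the harness (paths as in the log;
    # the empty JSON config selecting the software module is an assumption).
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    printf '%s\n' '{}' > accel.json
    ./build/examples/accel_perf -c accel.json -t 1 -w dif_generate_copy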
00:08:07.022 11:43:57 -- accel/accel.sh@115 -- # [[ y == y ]]
00:08:07.022 11:43:57 -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:08:07.022 11:43:57 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']'
00:08:07.022 11:43:57 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:07.022 11:43:57 -- common/autotest_common.sh@10 -- # set +x
00:08:07.022 ************************************
00:08:07.022 START TEST accel_comp
00:08:07.022 ************************************
00:08:07.022 11:43:57 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:08:07.022 11:43:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:08:07.022 11:43:57 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided]
00:08:07.022 [2024-04-18 11:43:57.452355] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:07.022 [2024-04-18 11:43:57.452491] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361289 ]
00:08:07.022 EAL: No free 2048 kB hugepages reported on node 1
00:08:07.281 [2024-04-18 11:43:57.596786] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:07.281 [2024-04-18 11:43:57.769071] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:07.541 11:43:57 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0x1, val=compress (accel_opc=compress), val='4096 bytes', val=software (accel_module=software), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=1, val='1 seconds', val=No, plus empty val= reads between sections
00:08:09.447 11:43:59 -- accel/accel.sh@20 -- # [post-run xtrace, elided] six empty val= reads
00:08:09.447 11:43:59 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:09.447 11:43:59 -- accel/accel.sh@27 -- # [[ -n compress ]]
00:08:09.447 11:43:59 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:09.447
00:08:09.447 real 0m2.111s
00:08:09.447 user 0m1.876s
00:08:09.447 sys 0m0.237s
00:08:09.447 11:43:59 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:09.447 11:43:59 -- common/autotest_common.sh@10 -- # set +x
00:08:09.447 ************************************
00:08:09.447 END TEST accel_comp
00:08:09.447 ************************************
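Every test in this log follows the same frame: a START banner, the timed body, a real/user/sys triplet from bash's time keyword, then an END banner. A minimal sketch of a wrapper with that behavior (the real run_test in common/autotest_common.sh also manages xtrace and argument counting, visible here as the '[' N -le 1 ']' checks; this is not its source):

    # Sketch of the banner-and-time frame wrapped around each test above.
    run_test() {
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      time "$@"          # emits the real/user/sys lines seen in the log
      local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
    }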
00:08:09.447 11:43:59 -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:08:09.447 11:43:59 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']'
00:08:09.447 11:43:59 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:09.447 11:43:59 -- common/autotest_common.sh@10 -- # set +x
00:08:09.447 ************************************
00:08:09.447 START TEST accel_decomp
00:08:09.447 ************************************
00:08:09.447 11:43:59 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:08:09.447 11:43:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y
00:08:09.447 11:43:59 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided]
00:08:09.447 [2024-04-18 11:43:59.752399] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:09.447 [2024-04-18 11:43:59.752489] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid361651 ]
00:08:09.447 EAL: No free 2048 kB hugepages reported on node 1
00:08:09.708 [2024-04-18 11:43:59.894318] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:09.708 [2024-04-18 11:44:00.071196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:09.708 11:44:00 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0x1, val=decompress (accel_opc=decompress), val='4096 bytes', val=software (accel_module=software), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=1, val='1 seconds', val=Yes, plus empty val= reads between sections
00:08:11.614 11:44:01 -- accel/accel.sh@20 -- # [post-run xtrace, elided] six empty val= reads
00:08:11.614 11:44:01 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:11.614 11:44:01 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:11.614 11:44:01 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:11.614
00:08:11.614 real 0m2.162s
00:08:11.614 user 0m1.926s
00:08:11.614 sys 0m0.237s
00:08:11.614 11:44:01 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:11.614 11:44:01 -- common/autotest_common.sh@10 -- # set +x
00:08:11.614 ************************************
00:08:11.614 END TEST accel_decomp
00:08:11.614 ************************************
00:08:11.614 11:44:01 -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:11.614 11:44:01 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:08:11.614 11:44:01 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:11.614 11:44:01 -- common/autotest_common.sh@10 -- # set +x
00:08:11.614 ************************************
00:08:11.614 START TEST accel_decmop_full
00:08:11.614 ************************************
00:08:11.614 11:44:02 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:11.614 11:44:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:11.614 11:44:02 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided]
00:08:11.614 [2024-04-18 11:44:02.093229] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:11.614 [2024-04-18 11:44:02.093327] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362029 ]
00:08:11.873 EAL: No free 2048 kB hugepages reported on node 1
00:08:11.873 [2024-04-18 11:44:02.235154] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:11.873 [2024-04-18 11:44:02.403239] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:12.133 11:44:02 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0x1, val=decompress (accel_opc=decompress), val='111250 bytes', val=software (accel_module=software), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=1, val='1 seconds', val=Yes, plus empty val= reads between sections
00:08:14.041 11:44:04 -- accel/accel.sh@20 -- # [post-run xtrace, elided] six empty val= reads
00:08:14.041 11:44:04 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:14.041 11:44:04 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:14.041 11:44:04 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:14.041
00:08:14.041 real 0m2.122s
00:08:14.041 user 0m1.884s
00:08:14.041 sys 0m0.239s
00:08:14.041 11:44:04 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:14.041 11:44:04 -- common/autotest_common.sh@10 -- # set +x
00:08:14.041 ************************************
00:08:14.041 END TEST accel_decmop_full
00:08:14.041 ************************************
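accel_decmop_full differs from the plain accel_decomp run only by -o 0, and the traced buffer size changes from '4096 bytes' to '111250 bytes', the size of the bib input; -o is read here, as an assumption, as the I/O size with 0 meaning "the whole file in one buffer":

    # Same decompress workload, full-file buffers; '-o 0' assumed to mean
    # "use the entire input (test/accel/bib, 111250 bytes) as one buffer".
    # val=Yes vs val=No in the traces tracks the -y flag, presumably verify.
    ./build/examples/accel_perf -c accel.json -t 1 -w decompress \
        -l test/accel/bib \
        -y \
        -o 0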
00:08:14.041 11:44:04 -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:14.041 11:44:04 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:08:14.041 11:44:04 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:14.041 11:44:04 -- common/autotest_common.sh@10 -- # set +x
00:08:14.041 ************************************
00:08:14.041 START TEST accel_decomp_mcore
00:08:14.041 ************************************
00:08:14.041 11:44:04 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:14.041 11:44:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:14.041 11:44:04 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided]
00:08:14.042 [2024-04-18 11:44:04.415185] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:14.042 [2024-04-18 11:44:04.415259] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362399 ]
00:08:14.042 EAL: No free 2048 kB hugepages reported on node 1
00:08:14.301 [2024-04-18 11:44:04.558645] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:14.301 [2024-04-18 11:44:04.733087] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:08:14.301 [2024-04-18 11:44:04.733142] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:08:14.301 [2024-04-18 11:44:04.733199] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:14.301 [2024-04-18 11:44:04.733208] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:08:14.560 11:44:04 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0xf, val=decompress (accel_opc=decompress), val='4096 bytes', val=software (accel_module=software), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=1, val='1 seconds', val=Yes, plus empty val= reads between sections
00:08:16.465 11:44:06 -- accel/accel.sh@20 -- # [post-run xtrace, elided] nine empty val= reads
00:08:16.465 11:44:06 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:16.465 11:44:06 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:16.465 11:44:06 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:16.465
00:08:16.465 real 0m2.170s
00:08:16.465 user 0m6.553s
00:08:16.465 sys 0m0.254s
00:08:16.465 11:44:06 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:16.465 11:44:06 -- common/autotest_common.sh@10 -- # set +x
00:08:16.465 ************************************
00:08:16.465 END TEST accel_decomp_mcore
00:08:16.465 ************************************
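-m 0xf selects cores 0 through 3, matching the four reactor NOTICE lines, and the timing above is consistent with that: 6.553s of user time packed into 2.170s of wall clock is roughly three cores busy on average across the four reactors. The mask is one bit per core:

    # Core-mask arithmetic for '-m 0xf' (bits 0..3 select cores 0..3).
    printf '0x%x\n' "$(( (1 << 0) | (1 << 1) | (1 << 2) | (1 << 3) ))"  # 0xf
    # Rough cross-check against the timing above: average busy cores.
    echo "scale=2; 6.553 / 2.170" | bc                                  # 3.01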
00:08:16.465 11:44:06 -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:08:16.465 11:44:06 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']'
00:08:16.465 11:44:06 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:08:16.465 11:44:06 -- common/autotest_common.sh@10 -- # set +x
00:08:16.465 ************************************
00:08:16.465 START TEST accel_decomp_full_mcore
00:08:16.465 ************************************
00:08:16.466 11:44:06 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:08:16.466 11:44:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:08:16.466 11:44:06 -- accel/accel.sh@12 -- # build_accel_config [xtrace elided]
00:08:16.466 [2024-04-18 11:44:06.782826] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:16.466 [2024-04-18 11:44:06.782900] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362628 ]
00:08:16.466 EAL: No free 2048 kB hugepages reported on node 1
00:08:16.726 [2024-04-18 11:44:06.923914] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:16.726 [2024-04-18 11:44:07.094834] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:08:16.726 [2024-04-18 11:44:07.094900] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:08:16.726 [2024-04-18 11:44:07.094955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:16.726 [2024-04-18 11:44:07.094965] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:08:16.726 11:44:07 -- accel/accel.sh@20 -- # [xtrace value loop, boilerplate elided] val=0xf, val=decompress (accel_opc=decompress), val='111250 bytes', val=software (accel_module=software), val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib, val=32, val=32, val=1, val='1 seconds', val=Yes, plus empty val= reads between sections
00:08:18.633 11:44:08 -- accel/accel.sh@20 -- # [post-run xtrace, elided] nine empty val= reads
00:08:18.633 11:44:08 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:18.633 11:44:08 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:18.634 11:44:08 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:18.634
00:08:18.634 real 0m2.171s
00:08:18.634 user 0m6.575s
00:08:18.634 sys 0m0.255s
00:08:18.634 11:44:08 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:08:18.634 11:44:08 -- common/autotest_common.sh@10 -- # set +x
00:08:18.634 ************************************
00:08:18.634 END TEST accel_decomp_full_mcore
00:08:18.634 ************************************
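The next test, accel_decomp_mthread, keeps the default single-core mask (the trace below shows one reactor on core 0) but adds -T 2, assumed here to ask for two worker threads per core:

    # Assumed reading of '-T 2': two parallel worker threads on the one core
    # provided by the default 0x1 mask; other flags as in the plain decompress run.
    ./build/examples/accel_perf -c accel.json -t 1 -w decompress \
        -l test/accel/bib -y \
        -T 2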
00:08:18.634 [2024-04-18 11:44:09.130106] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid362988 ] 00:08:18.893 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.894 [2024-04-18 11:44:09.270884] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.894 [2024-04-18 11:44:09.439681] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=0x1 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=decompress 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=software 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@22 -- # accel_module=software 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=32 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 
-- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=32 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=2 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val=Yes 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:19.153 11:44:09 -- accel/accel.sh@20 -- # val= 00:08:19.153 11:44:09 -- accel/accel.sh@21 -- # case "$var" in 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # IFS=: 00:08:19.153 11:44:09 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.061 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:21.061 11:44:11 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:21.061 11:44:11 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:21.061 00:08:21.061 real 0m2.141s 00:08:21.061 user 0m1.914s 00:08:21.061 sys 0m0.241s 00:08:21.061 11:44:11 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:21.061 11:44:11 -- common/autotest_common.sh@10 -- # set +x 
00:08:21.061 ************************************ 00:08:21.061 END TEST accel_decomp_mthread 00:08:21.061 ************************************ 00:08:21.061 11:44:11 -- accel/accel.sh@122 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:21.061 11:44:11 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:08:21.061 11:44:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:21.061 11:44:11 -- common/autotest_common.sh@10 -- # set +x 00:08:21.061 ************************************ 00:08:21.061 START TEST accel_deomp_full_mthread 00:08:21.061 ************************************ 00:08:21.061 11:44:11 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:21.061 11:44:11 -- accel/accel.sh@16 -- # local accel_opc 00:08:21.061 11:44:11 -- accel/accel.sh@17 -- # local accel_module 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.061 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.061 11:44:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:21.061 11:44:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:21.061 11:44:11 -- accel/accel.sh@12 -- # build_accel_config 00:08:21.061 11:44:11 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.061 11:44:11 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.061 11:44:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.061 11:44:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.061 11:44:11 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:21.061 11:44:11 -- accel/accel.sh@40 -- # local IFS=, 00:08:21.061 11:44:11 -- accel/accel.sh@41 -- # jq -r . 00:08:21.061 [2024-04-18 11:44:11.462451] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
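This run, accel_deomp_full_mthread ("deomp" is a typo carried verbatim from the upstream script's test name), differs from the accel_decomp_mthread run above only by the extra -o 0 flag; in the value block traced below the data size becomes '111250 bytes' instead of '4096 bytes', so the flag appears to select full-sized buffers for the decompress workload. Reproducing it outside the harness with the same flags (a sketch; stand-in config as before):

    ./build/examples/accel_perf -c <(echo '{}') -t 1 -w decompress \
        -l test/accel/bib -y -o 0 -T 2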
00:08:21.061 [2024-04-18 11:44:11.462523] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363365 ] 00:08:21.061 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.061 [2024-04-18 11:44:11.598696] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.321 [2024-04-18 11:44:11.766114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val=0x1 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val=decompress 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.581 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.581 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.581 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=software 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@22 -- # accel_module=software 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=32 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 
11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=32 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=2 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val=Yes 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:21.582 11:44:11 -- accel/accel.sh@20 -- # val= 00:08:21.582 11:44:11 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # IFS=: 00:08:21.582 11:44:11 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@20 -- # val= 00:08:23.491 11:44:13 -- accel/accel.sh@21 -- # case "$var" in 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # IFS=: 00:08:23.491 11:44:13 -- accel/accel.sh@19 -- # read -r var val 00:08:23.491 11:44:13 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:23.491 11:44:13 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:23.491 11:44:13 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:23.491 00:08:23.491 real 0m2.141s 00:08:23.491 user 0m1.931s 00:08:23.491 sys 0m0.227s 00:08:23.491 11:44:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:23.491 11:44:13 -- common/autotest_common.sh@10 -- # 
set +x 00:08:23.491 ************************************ 00:08:23.491 END TEST accel_deomp_full_mthread 00:08:23.491 ************************************ 00:08:23.491 11:44:13 -- accel/accel.sh@124 -- # [[ n == y ]] 00:08:23.491 11:44:13 -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:23.491 11:44:13 -- accel/accel.sh@137 -- # build_accel_config 00:08:23.491 11:44:13 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:08:23.491 11:44:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:23.491 11:44:13 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.491 11:44:13 -- common/autotest_common.sh@10 -- # set +x 00:08:23.491 11:44:13 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.491 11:44:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.491 11:44:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.491 11:44:13 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:23.491 11:44:13 -- accel/accel.sh@40 -- # local IFS=, 00:08:23.491 11:44:13 -- accel/accel.sh@41 -- # jq -r . 00:08:23.491 ************************************ 00:08:23.491 START TEST accel_dif_functional_tests 00:08:23.491 ************************************ 00:08:23.491 11:44:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:23.491 [2024-04-18 11:44:13.775807] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:23.491 [2024-04-18 11:44:13.775879] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363709 ] 00:08:23.491 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.491 [2024-04-18 11:44:13.912151] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:23.751 [2024-04-18 11:44:14.085992] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:23.751 [2024-04-18 11:44:14.086039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.751 [2024-04-18 11:44:14.086046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:08:24.010 00:08:24.010 00:08:24.010 CUnit - A unit testing framework for C - Version 2.1-3 00:08:24.010 http://cunit.sourceforge.net/ 00:08:24.010 00:08:24.010 00:08:24.010 Suite: accel_dif 00:08:24.010 Test: verify: DIF generated, GUARD check ...passed 00:08:24.010 Test: verify: DIF generated, APPTAG check ...passed 00:08:24.010 Test: verify: DIF generated, REFTAG check ...passed 00:08:24.010 Test: verify: DIF not generated, GUARD check ...[2024-04-18 11:44:14.329653] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:24.010 [2024-04-18 11:44:14.329707] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:24.010 passed 00:08:24.010 Test: verify: DIF not generated, APPTAG check ...[2024-04-18 11:44:14.329753] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:24.010 [2024-04-18 11:44:14.329780] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:24.010 passed 00:08:24.010 Test: verify: DIF not generated, REFTAG check ...[2024-04-18 11:44:14.329811] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:24.010 [2024-04-18 
11:44:14.329836] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:24.010 passed 00:08:24.010 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:24.010 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-18 11:44:14.329903] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:24.010 passed 00:08:24.010 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:24.010 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:24.010 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:24.010 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-18 11:44:14.330051] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:24.010 passed 00:08:24.010 Test: generate copy: DIF generated, GUARD check ...passed 00:08:24.010 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:24.010 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:24.010 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:24.010 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:24.010 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:24.010 Test: generate copy: iovecs-len validate ...[2024-04-18 11:44:14.330338] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:08:24.010 passed 00:08:24.010 Test: generate copy: buffer alignment validate ...passed 00:08:24.010 00:08:24.010 Run Summary: Type Total Ran Passed Failed Inactive 00:08:24.010 suites 1 1 n/a 0 0 00:08:24.010 tests 20 20 20 0 0 00:08:24.010 asserts 204 204 204 0 n/a 00:08:24.010 00:08:24.010 Elapsed time = 0.003 seconds 00:08:24.948 00:08:24.948 real 0m1.395s 00:08:24.948 user 0m2.683s 00:08:24.948 sys 0m0.263s 00:08:24.948 11:44:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:24.948 11:44:15 -- common/autotest_common.sh@10 -- # set +x 00:08:24.948 ************************************ 00:08:24.948 END TEST accel_dif_functional_tests 00:08:24.948 ************************************ 00:08:24.948 00:08:24.948 real 0m53.719s 00:08:24.948 user 0m56.029s 00:08:24.948 sys 0m8.782s 00:08:24.948 11:44:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:24.948 11:44:15 -- common/autotest_common.sh@10 -- # set +x 00:08:24.948 ************************************ 00:08:24.948 END TEST accel 00:08:24.948 ************************************ 00:08:24.948 11:44:15 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:24.948 11:44:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:24.948 11:44:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:24.948 11:44:15 -- common/autotest_common.sh@10 -- # set +x 00:08:24.948 ************************************ 00:08:24.948 START TEST accel_rpc 00:08:24.948 ************************************ 00:08:24.948 11:44:15 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:24.948 * Looking for test storage... 
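The accel_rpc suite starting here drives the opcode-assignment flow over JSON-RPC rather than through accel_perf. By hand, the sequence it exercises looks roughly like this (rpc.py and RPC method names as they appear in the trace below; the target is started with --wait-for-rpc so assignments land before framework init):

    ./build/bin/spdk_tgt --wait-for-rpc &
    ./scripts/rpc.py accel_assign_opc -o copy -m software     # record the desired module
    ./scripts/rpc.py framework_start_init                     # assignments take effect at init
    ./scripts/rpc.py accel_get_opc_assignments | jq -r .copy  # expect: software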
00:08:24.948 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:08:24.948 11:44:15 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:24.948 11:44:15 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=363987 00:08:24.948 11:44:15 -- accel/accel_rpc.sh@15 -- # waitforlisten 363987 00:08:24.948 11:44:15 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:24.948 11:44:15 -- common/autotest_common.sh@817 -- # '[' -z 363987 ']' 00:08:24.948 11:44:15 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:24.948 11:44:15 -- common/autotest_common.sh@822 -- # local max_retries=100 00:08:24.948 11:44:15 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:24.948 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:24.948 11:44:15 -- common/autotest_common.sh@826 -- # xtrace_disable 00:08:24.948 11:44:15 -- common/autotest_common.sh@10 -- # set +x 00:08:25.208 [2024-04-18 11:44:15.538688] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:25.208 [2024-04-18 11:44:15.538795] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid363987 ] 00:08:25.208 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.208 [2024-04-18 11:44:15.682574] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.467 [2024-04-18 11:44:15.850504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.036 11:44:16 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:08:26.036 11:44:16 -- common/autotest_common.sh@850 -- # return 0 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:26.036 11:44:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:26.036 11:44:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:26.036 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:08:26.036 ************************************ 00:08:26.036 START TEST accel_assign_opcode 00:08:26.036 ************************************ 00:08:26.036 11:44:16 -- common/autotest_common.sh@1111 -- # accel_assign_opcode_test_suite 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:26.036 11:44:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:08:26.036 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:08:26.036 [2024-04-18 11:44:16.492808] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:26.036 11:44:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:08:26.036 11:44:16 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:26.036 11:44:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:08:26.036 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:08:26.037 [2024-04-18 11:44:16.500827] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation copy will be assigned to module software 00:08:26.037 11:44:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:08:26.037 11:44:16 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:26.037 11:44:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:08:26.037 11:44:16 -- common/autotest_common.sh@10 -- # set +x 00:08:26.668 11:44:17 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:08:26.668 11:44:17 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:26.668 11:44:17 -- common/autotest_common.sh@549 -- # xtrace_disable 00:08:26.668 11:44:17 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:26.668 11:44:17 -- common/autotest_common.sh@10 -- # set +x 00:08:26.668 11:44:17 -- accel/accel_rpc.sh@42 -- # grep software 00:08:26.668 11:44:17 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:08:26.668 software 00:08:26.668 00:08:26.668 real 0m0.641s 00:08:26.668 user 0m0.046s 00:08:26.668 sys 0m0.016s 00:08:26.668 11:44:17 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:26.668 11:44:17 -- common/autotest_common.sh@10 -- # set +x 00:08:26.668 ************************************ 00:08:26.668 END TEST accel_assign_opcode 00:08:26.668 ************************************ 00:08:26.668 11:44:17 -- accel/accel_rpc.sh@55 -- # killprocess 363987 00:08:26.668 11:44:17 -- common/autotest_common.sh@936 -- # '[' -z 363987 ']' 00:08:26.668 11:44:17 -- common/autotest_common.sh@940 -- # kill -0 363987 00:08:26.668 11:44:17 -- common/autotest_common.sh@941 -- # uname 00:08:26.668 11:44:17 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:08:26.668 11:44:17 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 363987 00:08:26.668 11:44:17 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:08:26.668 11:44:17 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:08:26.668 11:44:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 363987' 00:08:26.668 killing process with pid 363987 00:08:26.668 11:44:17 -- common/autotest_common.sh@955 -- # kill 363987 00:08:26.668 11:44:17 -- common/autotest_common.sh@960 -- # wait 363987 00:08:28.577 00:08:28.577 real 0m3.360s 00:08:28.577 user 0m3.272s 00:08:28.577 sys 0m0.685s 00:08:28.577 11:44:18 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:28.577 11:44:18 -- common/autotest_common.sh@10 -- # set +x 00:08:28.577 ************************************ 00:08:28.577 END TEST accel_rpc 00:08:28.577 ************************************ 00:08:28.577 11:44:18 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:08:28.577 11:44:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:28.577 11:44:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:28.577 11:44:18 -- common/autotest_common.sh@10 -- # set +x 00:08:28.577 ************************************ 00:08:28.577 START TEST app_cmdline 00:08:28.577 ************************************ 00:08:28.577 11:44:18 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:08:28.577 * Looking for test storage... 
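The app_cmdline suite starting here runs spdk_tgt with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods may be called. By hand, the allow-list behaves as the trace below shows (a sketch using the same rpc.py):

    ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    ./scripts/rpc.py spdk_get_version         # allowed: returns the version JSON
    ./scripts/rpc.py rpc_get_methods          # allowed: lists exactly the two permitted methods
    ./scripts/rpc.py env_dpdk_get_mem_stats   # blocked: JSON-RPC error -32601 "Method not found"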
00:08:28.577 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:28.577 11:44:19 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:28.577 11:44:19 -- app/cmdline.sh@17 -- # spdk_tgt_pid=364527 00:08:28.577 11:44:19 -- app/cmdline.sh@18 -- # waitforlisten 364527 00:08:28.577 11:44:19 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:28.577 11:44:19 -- common/autotest_common.sh@817 -- # '[' -z 364527 ']' 00:08:28.577 11:44:19 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:28.577 11:44:19 -- common/autotest_common.sh@822 -- # local max_retries=100 00:08:28.577 11:44:19 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:28.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:28.577 11:44:19 -- common/autotest_common.sh@826 -- # xtrace_disable 00:08:28.577 11:44:19 -- common/autotest_common.sh@10 -- # set +x 00:08:28.577 [2024-04-18 11:44:19.078992] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:28.577 [2024-04-18 11:44:19.079088] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid364527 ] 00:08:28.837 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.837 [2024-04-18 11:44:19.221540] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.096 [2024-04-18 11:44:19.389990] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.665 11:44:19 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:08:29.665 11:44:19 -- common/autotest_common.sh@850 -- # return 0 00:08:29.665 11:44:19 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:29.665 { 00:08:29.665 "version": "SPDK v24.05-pre git sha1 65b4e17c6", 00:08:29.665 "fields": { 00:08:29.665 "major": 24, 00:08:29.665 "minor": 5, 00:08:29.665 "patch": 0, 00:08:29.665 "suffix": "-pre", 00:08:29.665 "commit": "65b4e17c6" 00:08:29.665 } 00:08:29.665 } 00:08:29.665 11:44:20 -- app/cmdline.sh@22 -- # expected_methods=() 00:08:29.665 11:44:20 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:29.665 11:44:20 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:29.665 11:44:20 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:29.665 11:44:20 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:29.665 11:44:20 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:29.665 11:44:20 -- common/autotest_common.sh@549 -- # xtrace_disable 00:08:29.665 11:44:20 -- common/autotest_common.sh@10 -- # set +x 00:08:29.665 11:44:20 -- app/cmdline.sh@26 -- # sort 00:08:29.665 11:44:20 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:08:29.665 11:44:20 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:29.665 11:44:20 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:29.665 11:44:20 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.665 11:44:20 -- common/autotest_common.sh@638 -- # local es=0 00:08:29.665 11:44:20 -- 
common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.665 11:44:20 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:29.665 11:44:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:08:29.665 11:44:20 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:29.665 11:44:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:08:29.665 11:44:20 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:29.665 11:44:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:08:29.665 11:44:20 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:08:29.665 11:44:20 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:08:29.665 11:44:20 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:29.925 request: 00:08:29.925 { 00:08:29.925 "method": "env_dpdk_get_mem_stats", 00:08:29.925 "req_id": 1 00:08:29.925 } 00:08:29.925 Got JSON-RPC error response 00:08:29.925 response: 00:08:29.925 { 00:08:29.925 "code": -32601, 00:08:29.925 "message": "Method not found" 00:08:29.925 } 00:08:29.925 11:44:20 -- common/autotest_common.sh@641 -- # es=1 00:08:29.925 11:44:20 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:08:29.925 11:44:20 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:08:29.925 11:44:20 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:08:29.925 11:44:20 -- app/cmdline.sh@1 -- # killprocess 364527 00:08:29.925 11:44:20 -- common/autotest_common.sh@936 -- # '[' -z 364527 ']' 00:08:29.925 11:44:20 -- common/autotest_common.sh@940 -- # kill -0 364527 00:08:29.925 11:44:20 -- common/autotest_common.sh@941 -- # uname 00:08:29.925 11:44:20 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:08:29.925 11:44:20 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 364527 00:08:29.925 11:44:20 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:08:29.925 11:44:20 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:08:29.925 11:44:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 364527' 00:08:29.925 killing process with pid 364527 00:08:29.925 11:44:20 -- common/autotest_common.sh@955 -- # kill 364527 00:08:29.925 11:44:20 -- common/autotest_common.sh@960 -- # wait 364527 00:08:31.831 00:08:31.831 real 0m3.069s 00:08:31.831 user 0m3.173s 00:08:31.831 sys 0m0.672s 00:08:31.831 11:44:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:31.831 11:44:21 -- common/autotest_common.sh@10 -- # set +x 00:08:31.831 ************************************ 00:08:31.831 END TEST app_cmdline 00:08:31.831 ************************************ 00:08:31.831 11:44:22 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:31.831 11:44:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:31.831 11:44:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:31.831 11:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:31.831 ************************************ 00:08:31.831 START TEST version 00:08:31.831 
************************************ 00:08:31.831 11:44:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:08:31.831 * Looking for test storage... 00:08:31.831 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:31.831 11:44:22 -- app/version.sh@17 -- # get_header_version major 00:08:31.831 11:44:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:31.832 11:44:22 -- app/version.sh@14 -- # cut -f2 00:08:31.832 11:44:22 -- app/version.sh@14 -- # tr -d '"' 00:08:31.832 11:44:22 -- app/version.sh@17 -- # major=24 00:08:31.832 11:44:22 -- app/version.sh@18 -- # get_header_version minor 00:08:31.832 11:44:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:31.832 11:44:22 -- app/version.sh@14 -- # cut -f2 00:08:31.832 11:44:22 -- app/version.sh@14 -- # tr -d '"' 00:08:31.832 11:44:22 -- app/version.sh@18 -- # minor=5 00:08:31.832 11:44:22 -- app/version.sh@19 -- # get_header_version patch 00:08:31.832 11:44:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:31.832 11:44:22 -- app/version.sh@14 -- # cut -f2 00:08:31.832 11:44:22 -- app/version.sh@14 -- # tr -d '"' 00:08:31.832 11:44:22 -- app/version.sh@19 -- # patch=0 00:08:31.832 11:44:22 -- app/version.sh@20 -- # get_header_version suffix 00:08:31.832 11:44:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:08:31.832 11:44:22 -- app/version.sh@14 -- # cut -f2 00:08:31.832 11:44:22 -- app/version.sh@14 -- # tr -d '"' 00:08:31.832 11:44:22 -- app/version.sh@20 -- # suffix=-pre 00:08:31.832 11:44:22 -- app/version.sh@22 -- # version=24.5 00:08:31.832 11:44:22 -- app/version.sh@25 -- # (( patch != 0 )) 00:08:31.832 11:44:22 -- app/version.sh@28 -- # version=24.5rc0 00:08:31.832 11:44:22 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:31.832 11:44:22 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:32.092 11:44:22 -- app/version.sh@30 -- # py_version=24.5rc0 00:08:32.092 11:44:22 -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:08:32.092 00:08:32.092 real 0m0.184s 00:08:32.092 user 0m0.095s 00:08:32.092 sys 0m0.137s 00:08:32.092 11:44:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:32.092 11:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:32.092 ************************************ 00:08:32.092 END TEST version 00:08:32.092 ************************************ 00:08:32.092 11:44:22 -- spdk/autotest.sh@184 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@194 -- # uname -s 00:08:32.092 11:44:22 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 
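The version suite above derives every component straight from include/spdk/version.h: grep the #define, cut the tab-separated value field, strip the quotes. Extracted as a standalone helper, the pattern is (a sketch of what version.sh does per the trace; run from the repo root):

    get_header_version() {
        grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" include/spdk/version.h |
            cut -f2 | tr -d '"'
    }
    major=$(get_header_version MAJOR)   # 24 in this run
    minor=$(get_header_version MINOR)   # 5
    patch=$(get_header_version PATCH)   # 0, so the computed version comes out as 24.5rc0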
00:08:32.092 11:44:22 -- spdk/autotest.sh@254 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@258 -- # timing_exit lib 00:08:32.092 11:44:22 -- common/autotest_common.sh@716 -- # xtrace_disable 00:08:32.092 11:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:32.092 11:44:22 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@268 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@277 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@310 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@314 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@328 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@341 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@345 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@354 -- # '[' 0 -eq 1 ']' 00:08:32.092 11:44:22 -- spdk/autotest.sh@361 -- # [[ 0 -eq 1 ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@369 -- # [[ 1 -eq 1 ]] 00:08:32.092 11:44:22 -- spdk/autotest.sh@370 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:32.092 11:44:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:32.092 11:44:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:32.092 11:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:32.092 ************************************ 00:08:32.092 START TEST llvm_fuzz 00:08:32.092 ************************************ 00:08:32.092 11:44:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:08:32.363 * Looking for test storage... 
00:08:32.363 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:08:32.363 11:44:22 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:08:32.363 11:44:22 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:08:32.363 11:44:22 -- common/autotest_common.sh@536 -- # fuzzers=() 00:08:32.363 11:44:22 -- common/autotest_common.sh@536 -- # local fuzzers 00:08:32.363 11:44:22 -- common/autotest_common.sh@538 -- # [[ -n '' ]] 00:08:32.363 11:44:22 -- common/autotest_common.sh@541 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:08:32.363 11:44:22 -- common/autotest_common.sh@542 -- # fuzzers=("${fuzzers[@]##*/}") 00:08:32.363 11:44:22 -- common/autotest_common.sh@545 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:08:32.363 11:44:22 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:08:32.363 11:44:22 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:08:32.363 11:44:22 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:08:32.363 11:44:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:32.363 11:44:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:32.363 11:44:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:32.363 11:44:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:32.363 11:44:22 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:32.363 11:44:22 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:32.363 11:44:22 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:32.363 11:44:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:32.363 11:44:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:32.363 11:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:32.363 ************************************ 00:08:32.363 START TEST nvmf_fuzz 00:08:32.363 ************************************ 00:08:32.363 11:44:22 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:08:32.624 * Looking for test storage... 
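The nvmf fuzz run starting here first pulls in the build configuration; the CONFIG_FUZZER=y and CONFIG_FUZZER_LIB entries dumped below show the fuzz targets link clang 16's libclang_rt.fuzzer_no_main archive with ASan and UBSan enabled. For orientation, a self-contained libFuzzer target looks like this (illustrative only; SPDK's real targets use the no_main variant and supply their own driver):

    cat > harness.c <<'EOF'
    #include <stddef.h>
    #include <stdint.h>
    /* libFuzzer invokes this once per generated input */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) { return 0; }
    EOF
    clang -g -fsanitize=fuzzer,address,undefined harness.c -o harness
    ./harness -runs=1000   # bounded smoke run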
00:08:32.624 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.624 11:44:23 -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:32.624 11:44:23 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:32.624 11:44:23 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:32.624 11:44:23 -- common/autotest_common.sh@34 -- # set -e 00:08:32.624 11:44:23 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:32.624 11:44:23 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:32.624 11:44:23 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:32.624 11:44:23 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:32.624 11:44:23 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:32.624 11:44:23 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:32.624 11:44:23 -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:08:32.624 11:44:23 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:32.624 11:44:23 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:32.624 11:44:23 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:32.624 11:44:23 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:32.624 11:44:23 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:32.625 11:44:23 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:32.625 11:44:23 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:32.625 11:44:23 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:32.625 11:44:23 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:32.625 11:44:23 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:32.625 11:44:23 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:32.625 11:44:23 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:32.625 11:44:23 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:32.625 11:44:23 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:32.625 11:44:23 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:32.625 11:44:23 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:32.625 11:44:23 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:32.625 11:44:23 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:32.625 11:44:23 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:32.625 11:44:23 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:32.625 11:44:23 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:32.625 11:44:23 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:32.625 11:44:23 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:32.625 11:44:23 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:32.625 11:44:23 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:32.625 11:44:23 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:32.625 11:44:23 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:32.625 11:44:23 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:32.625 11:44:23 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:08:32.625 11:44:23 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:32.625 11:44:23 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:32.625 11:44:23 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:32.625 11:44:23 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:32.625 11:44:23 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:32.625 11:44:23 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:32.625 11:44:23 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:32.625 11:44:23 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:32.625 11:44:23 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:32.625 11:44:23 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:32.625 11:44:23 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:32.625 11:44:23 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:32.625 11:44:23 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:32.625 11:44:23 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:32.625 11:44:23 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:32.625 11:44:23 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:32.625 11:44:23 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:32.625 11:44:23 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:08:32.625 11:44:23 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:08:32.625 11:44:23 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:08:32.625 11:44:23 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:08:32.625 11:44:23 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:08:32.625 11:44:23 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:08:32.625 11:44:23 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:08:32.625 11:44:23 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:08:32.625 11:44:23 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:08:32.625 11:44:23 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:08:32.625 11:44:23 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:08:32.625 11:44:23 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:08:32.625 11:44:23 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:08:32.625 11:44:23 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:08:32.625 11:44:23 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:32.625 11:44:23 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:08:32.625 11:44:23 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:08:32.625 11:44:23 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:08:32.625 11:44:23 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:08:32.625 11:44:23 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:08:32.625 11:44:23 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:08:32.625 11:44:23 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:08:32.625 11:44:23 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:08:32.625 11:44:23 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:08:32.625 11:44:23 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 
00:08:32.625 11:44:23 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:08:32.625 11:44:23 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:32.625 11:44:23 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:08:32.625 11:44:23 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:08:32.625 11:44:23 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:32.625 11:44:23 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:32.625 11:44:23 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:32.625 11:44:23 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:32.625 11:44:23 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:32.625 11:44:23 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.625 11:44:23 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:32.625 11:44:23 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.625 11:44:23 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:32.625 11:44:23 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:32.625 11:44:23 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:32.625 11:44:23 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:32.625 11:44:23 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:32.625 11:44:23 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:32.625 11:44:23 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:32.625 11:44:23 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:32.625 #define SPDK_CONFIG_H 00:08:32.625 #define SPDK_CONFIG_APPS 1 00:08:32.625 #define SPDK_CONFIG_ARCH native 00:08:32.625 #define SPDK_CONFIG_ASAN 1 00:08:32.625 #undef SPDK_CONFIG_AVAHI 00:08:32.625 #undef SPDK_CONFIG_CET 00:08:32.625 #define SPDK_CONFIG_COVERAGE 1 00:08:32.625 #define SPDK_CONFIG_CROSS_PREFIX 00:08:32.625 #undef SPDK_CONFIG_CRYPTO 00:08:32.625 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:32.625 #undef SPDK_CONFIG_CUSTOMOCF 00:08:32.625 #undef SPDK_CONFIG_DAOS 00:08:32.625 #define SPDK_CONFIG_DAOS_DIR 00:08:32.625 #define SPDK_CONFIG_DEBUG 1 00:08:32.625 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:32.625 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:32.625 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:32.625 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:32.625 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:32.625 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:32.625 #define SPDK_CONFIG_EXAMPLES 1 00:08:32.625 #undef SPDK_CONFIG_FC 00:08:32.625 #define SPDK_CONFIG_FC_PATH 00:08:32.625 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:32.625 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:32.625 #undef SPDK_CONFIG_FUSE 00:08:32.625 #define SPDK_CONFIG_FUZZER 1 00:08:32.625 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:32.625 #undef SPDK_CONFIG_GOLANG 00:08:32.625 
#define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:32.625 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:32.625 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:32.625 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:08:32.625 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:32.625 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:32.625 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:32.625 #define SPDK_CONFIG_IDXD 1 00:08:32.625 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:32.625 #undef SPDK_CONFIG_IPSEC_MB 00:08:32.625 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:32.625 #define SPDK_CONFIG_ISAL 1 00:08:32.625 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:32.625 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:32.625 #define SPDK_CONFIG_LIBDIR 00:08:32.625 #undef SPDK_CONFIG_LTO 00:08:32.625 #define SPDK_CONFIG_MAX_LCORES 00:08:32.625 #define SPDK_CONFIG_NVME_CUSE 1 00:08:32.625 #undef SPDK_CONFIG_OCF 00:08:32.625 #define SPDK_CONFIG_OCF_PATH 00:08:32.625 #define SPDK_CONFIG_OPENSSL_PATH 00:08:32.625 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:32.625 #define SPDK_CONFIG_PGO_DIR 00:08:32.625 #undef SPDK_CONFIG_PGO_USE 00:08:32.625 #define SPDK_CONFIG_PREFIX /usr/local 00:08:32.625 #undef SPDK_CONFIG_RAID5F 00:08:32.625 #undef SPDK_CONFIG_RBD 00:08:32.625 #define SPDK_CONFIG_RDMA 1 00:08:32.625 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:32.625 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:32.625 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:32.625 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:32.625 #undef SPDK_CONFIG_SHARED 00:08:32.625 #undef SPDK_CONFIG_SMA 00:08:32.625 #define SPDK_CONFIG_TESTS 1 00:08:32.625 #undef SPDK_CONFIG_TSAN 00:08:32.625 #define SPDK_CONFIG_UBLK 1 00:08:32.625 #define SPDK_CONFIG_UBSAN 1 00:08:32.625 #undef SPDK_CONFIG_UNIT_TESTS 00:08:32.625 #undef SPDK_CONFIG_URING 00:08:32.625 #define SPDK_CONFIG_URING_PATH 00:08:32.625 #undef SPDK_CONFIG_URING_ZNS 00:08:32.625 #undef SPDK_CONFIG_USDT 00:08:32.625 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:32.625 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:32.625 #define SPDK_CONFIG_VFIO_USER 1 00:08:32.625 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:32.625 #define SPDK_CONFIG_VHOST 1 00:08:32.625 #define SPDK_CONFIG_VIRTIO 1 00:08:32.625 #undef SPDK_CONFIG_VTUNE 00:08:32.625 #define SPDK_CONFIG_VTUNE_DIR 00:08:32.625 #define SPDK_CONFIG_WERROR 1 00:08:32.626 #define SPDK_CONFIG_WPDK_DIR 00:08:32.626 #undef SPDK_CONFIG_XNVME 00:08:32.626 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:32.626 11:44:23 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:32.626 11:44:23 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:32.626 11:44:23 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:32.626 11:44:23 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:32.626 11:44:23 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:32.626 11:44:23 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.626 11:44:23 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.626 11:44:23 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.626 11:44:23 -- paths/export.sh@5 -- # export PATH 00:08:32.626 11:44:23 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:32.626 11:44:23 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:32.626 11:44:23 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:32.626 11:44:23 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:32.626 11:44:23 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:32.626 11:44:23 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:32.626 11:44:23 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:32.626 11:44:23 -- pm/common@67 -- # TEST_TAG=N/A 00:08:32.626 11:44:23 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:32.626 11:44:23 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:32.626 11:44:23 -- pm/common@71 -- # uname -s 00:08:32.626 11:44:23 -- pm/common@71 -- # PM_OS=Linux 00:08:32.626 11:44:23 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:32.626 11:44:23 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:08:32.626 11:44:23 -- pm/common@76 -- # [[ Linux == Linux ]] 00:08:32.626 11:44:23 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:08:32.626 11:44:23 -- pm/common@76 -- # [[ ! 
-e /.dockerenv ]] 00:08:32.626 11:44:23 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:32.626 11:44:23 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:32.626 11:44:23 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:08:32.626 11:44:23 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:08:32.626 11:44:23 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:32.626 11:44:23 -- common/autotest_common.sh@57 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:08:32.626 11:44:23 -- common/autotest_common.sh@61 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:32.626 11:44:23 -- common/autotest_common.sh@63 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:08:32.626 11:44:23 -- common/autotest_common.sh@65 -- # : 1 00:08:32.626 11:44:23 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:32.626 11:44:23 -- common/autotest_common.sh@67 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:08:32.626 11:44:23 -- common/autotest_common.sh@69 -- # : 00:08:32.626 11:44:23 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:08:32.626 11:44:23 -- common/autotest_common.sh@71 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:08:32.626 11:44:23 -- common/autotest_common.sh@73 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:08:32.626 11:44:23 -- common/autotest_common.sh@75 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:08:32.626 11:44:23 -- common/autotest_common.sh@77 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:32.626 11:44:23 -- common/autotest_common.sh@79 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:08:32.626 11:44:23 -- common/autotest_common.sh@81 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:08:32.626 11:44:23 -- common/autotest_common.sh@83 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:08:32.626 11:44:23 -- common/autotest_common.sh@85 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:08:32.626 11:44:23 -- common/autotest_common.sh@87 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:08:32.626 11:44:23 -- common/autotest_common.sh@89 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:08:32.626 11:44:23 -- common/autotest_common.sh@91 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:08:32.626 11:44:23 -- common/autotest_common.sh@93 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:08:32.626 11:44:23 -- common/autotest_common.sh@95 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:32.626 11:44:23 -- common/autotest_common.sh@97 -- # : 1 00:08:32.626 11:44:23 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:08:32.626 11:44:23 -- common/autotest_common.sh@99 -- # : 1 00:08:32.626 11:44:23 -- common/autotest_common.sh@100 -- # export 
SPDK_TEST_FUZZER_SHORT 00:08:32.626 11:44:23 -- common/autotest_common.sh@101 -- # : rdma 00:08:32.626 11:44:23 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:32.626 11:44:23 -- common/autotest_common.sh@103 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:08:32.626 11:44:23 -- common/autotest_common.sh@105 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:08:32.626 11:44:23 -- common/autotest_common.sh@107 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:08:32.626 11:44:23 -- common/autotest_common.sh@109 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:08:32.626 11:44:23 -- common/autotest_common.sh@111 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:08:32.626 11:44:23 -- common/autotest_common.sh@113 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:08:32.626 11:44:23 -- common/autotest_common.sh@115 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:08:32.626 11:44:23 -- common/autotest_common.sh@117 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:32.626 11:44:23 -- common/autotest_common.sh@119 -- # : 1 00:08:32.626 11:44:23 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:08:32.626 11:44:23 -- common/autotest_common.sh@121 -- # : 1 00:08:32.626 11:44:23 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:08:32.626 11:44:23 -- common/autotest_common.sh@123 -- # : 00:08:32.626 11:44:23 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:32.626 11:44:23 -- common/autotest_common.sh@125 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:08:32.626 11:44:23 -- common/autotest_common.sh@127 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:08:32.626 11:44:23 -- common/autotest_common.sh@129 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:08:32.626 11:44:23 -- common/autotest_common.sh@131 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:08:32.626 11:44:23 -- common/autotest_common.sh@133 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:08:32.626 11:44:23 -- common/autotest_common.sh@135 -- # : 0 00:08:32.626 11:44:23 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:08:32.626 11:44:23 -- common/autotest_common.sh@137 -- # : 00:08:32.626 11:44:23 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:08:32.626 11:44:23 -- common/autotest_common.sh@139 -- # : true 00:08:32.627 11:44:23 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:08:32.627 11:44:23 -- common/autotest_common.sh@141 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:08:32.627 11:44:23 -- common/autotest_common.sh@143 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:08:32.627 11:44:23 -- common/autotest_common.sh@145 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:08:32.627 11:44:23 -- common/autotest_common.sh@147 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@148 
-- # export SPDK_TEST_USE_IGB_UIO 00:08:32.627 11:44:23 -- common/autotest_common.sh@149 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:08:32.627 11:44:23 -- common/autotest_common.sh@151 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:08:32.627 11:44:23 -- common/autotest_common.sh@153 -- # : 00:08:32.627 11:44:23 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:08:32.627 11:44:23 -- common/autotest_common.sh@155 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:08:32.627 11:44:23 -- common/autotest_common.sh@157 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:08:32.627 11:44:23 -- common/autotest_common.sh@159 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:08:32.627 11:44:23 -- common/autotest_common.sh@161 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:08:32.627 11:44:23 -- common/autotest_common.sh@163 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:08:32.627 11:44:23 -- common/autotest_common.sh@166 -- # : 00:08:32.627 11:44:23 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:08:32.627 11:44:23 -- common/autotest_common.sh@168 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:08:32.627 11:44:23 -- common/autotest_common.sh@170 -- # : 0 00:08:32.627 11:44:23 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:32.627 11:44:23 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.627 11:44:23 -- 
common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:32.627 11:44:23 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:32.627 11:44:23 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:32.627 11:44:23 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:32.627 11:44:23 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:08:32.627 11:44:23 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:32.627 11:44:23 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:32.627 11:44:23 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:32.627 11:44:23 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:32.627 11:44:23 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:32.627 11:44:23 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:08:32.627 11:44:23 -- common/autotest_common.sh@199 -- # cat 00:08:32.627 11:44:23 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:08:32.627 11:44:23 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:32.627 11:44:23 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 
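This is the point where the harness arms the sanitizers for everything that follows: ASAN_OPTIONS and UBSAN_OPTIONS are exported, the LeakSanitizer suppression file is rebuilt, and LSAN_OPTIONS is pointed at it. A minimal standalone sketch of the same pattern, assuming only the values visible in the trace (the trailing run_tests call is a hypothetical placeholder, not part of SPDK):

#!/usr/bin/env bash
# Sketch of the sanitizer setup traced above; option strings copied from the log.
export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134

# Rebuild the leak-suppression list from scratch. Each "leak:<pattern>" line
# tells LeakSanitizer to ignore leaks whose stack trace matches the pattern.
asan_suppression_file=/var/tmp/asan_suppression_file
rm -rf "$asan_suppression_file"
echo "leak:libfuse3.so" >> "$asan_suppression_file"
export LSAN_OPTIONS=suppressions=$asan_suppression_file

run_tests "$@"   # hypothetical stand-in for the test stages launched next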
00:08:32.627 11:44:23 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:32.627 11:44:23 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:32.627 11:44:23 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:08:32.627 11:44:23 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:08:32.627 11:44:23 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.627 11:44:23 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:32.627 11:44:23 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.627 11:44:23 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:32.627 11:44:23 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:32.627 11:44:23 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:32.627 11:44:23 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:32.627 11:44:23 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:32.627 11:44:23 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:32.627 11:44:23 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:32.627 11:44:23 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:08:32.627 11:44:23 -- common/autotest_common.sh@252 -- # export valgrind= 00:08:32.627 11:44:23 -- common/autotest_common.sh@252 -- # valgrind= 00:08:32.627 11:44:23 -- common/autotest_common.sh@258 -- # uname -s 00:08:32.627 11:44:23 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:08:32.627 11:44:23 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:08:32.627 11:44:23 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:08:32.627 11:44:23 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@268 -- # MAKE=make 00:08:32.627 11:44:23 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72 00:08:32.627 11:44:23 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:08:32.627 11:44:23 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:08:32.627 11:44:23 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:08:32.627 11:44:23 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:08:32.627 11:44:23 -- common/autotest_common.sh@307 -- # [[ -z 365181 ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@307 -- # kill -0 365181 00:08:32.627 11:44:23 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:32.627 11:44:23 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 
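set_test_storage, which begins here, needs about 2 GiB (2147483648 bytes) of scratch space for corpus and temp files; the trace below shows it feeding df -T through a read loop into associative arrays keyed by mount point, then walking the candidate directories until one has enough room. A condensed sketch of that parsing loop, reusing the array and variable names visible in the trace:

# Condensed sketch of the df -T parsing traced below (same array names).
declare -A mounts fss sizes avails uses
while read -r source fs size use avail _ mount; do
    mounts["$mount"]=$source          # e.g. mounts[/]=spdk_root
    fss["$mount"]=$fs                 # fs type, later checked against tmpfs/ramfs
    sizes["$mount"]=$((size * 1024))  # df -T prints 1K blocks; store bytes
    uses["$mount"]=$((use * 1024))
    avails["$mount"]=$((avail * 1024))
done < <(df -T | grep -v Filesystem)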
00:08:32.627 11:44:23 -- common/autotest_common.sh@320 -- # local mount target_dir 00:08:32.627 11:44:23 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:08:32.627 11:44:23 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:08:32.627 11:44:23 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:08:32.627 11:44:23 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:08:32.627 11:44:23 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.pVtwjm 00:08:32.627 11:44:23 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:32.627 11:44:23 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:08:32.627 11:44:23 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.pVtwjm/tests/nvmf /tmp/spdk.pVtwjm 00:08:32.627 11:44:23 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:08:32.627 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.627 11:44:23 -- common/autotest_common.sh@316 -- # df -T 00:08:32.627 11:44:23 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=997285888 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=4287143936 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=86199869440 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=94508580864 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=8308711424 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=47251677184 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254290432 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=2613248 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=18895638528 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=18901716992 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=6078464 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=47253745664 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254290432 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=544768 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # avails["$mount"]=9450852352 00:08:32.628 11:44:23 -- common/autotest_common.sh@351 -- # sizes["$mount"]=9450856448 00:08:32.628 11:44:23 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:08:32.628 11:44:23 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:32.628 11:44:23 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:08:32.628 * Looking for test storage... 00:08:32.628 11:44:23 -- common/autotest_common.sh@357 -- # local target_space new_size 00:08:32.628 11:44:23 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:08:32.628 11:44:23 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.628 11:44:23 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:32.628 11:44:23 -- common/autotest_common.sh@361 -- # mount=/ 00:08:32.628 11:44:23 -- common/autotest_common.sh@363 -- # target_space=86199869440 00:08:32.628 11:44:23 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:08:32.628 11:44:23 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:08:32.628 11:44:23 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:08:32.628 11:44:23 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:08:32.628 11:44:23 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:08:32.628 11:44:23 -- common/autotest_common.sh@370 -- # new_size=10523303936 00:08:32.628 11:44:23 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:32.628 11:44:23 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.628 11:44:23 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.628 11:44:23 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.628 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:08:32.628 11:44:23 -- common/autotest_common.sh@378 -- # return 0 00:08:32.628 11:44:23 -- 
common/autotest_common.sh@1668 -- # set -o errtrace 00:08:32.628 11:44:23 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:32.628 11:44:23 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:32.628 11:44:23 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:32.628 11:44:23 -- common/autotest_common.sh@1673 -- # true 00:08:32.628 11:44:23 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:32.628 11:44:23 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:32.628 11:44:23 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:32.628 11:44:23 -- common/autotest_common.sh@27 -- # exec 00:08:32.628 11:44:23 -- common/autotest_common.sh@29 -- # exec 00:08:32.628 11:44:23 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:32.628 11:44:23 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:32.628 11:44:23 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:32.887 11:44:23 -- common/autotest_common.sh@18 -- # set -x 00:08:32.887 11:44:23 -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:08:32.887 11:44:23 -- ../common.sh@8 -- # pids=() 00:08:32.887 11:44:23 -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:32.887 11:44:23 -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:08:32.887 11:44:23 -- nvmf/run.sh@64 -- # fuzz_num=25 00:08:32.887 11:44:23 -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:08:32.887 11:44:23 -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:32.887 11:44:23 -- nvmf/run.sh@69 -- # mem_size=512 00:08:32.887 11:44:23 -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:08:32.887 11:44:23 -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:08:32.887 11:44:23 -- ../common.sh@69 -- # local fuzz_num=25 00:08:32.887 11:44:23 -- ../common.sh@70 -- # local time=1 00:08:32.887 11:44:23 -- ../common.sh@72 -- # (( i = 0 )) 00:08:32.887 11:44:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.887 11:44:23 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:32.887 11:44:23 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:08:32.887 11:44:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.887 11:44:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.887 11:44:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:32.887 11:44:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:08:32.887 11:44:23 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.887 11:44:23 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.887 11:44:23 -- nvmf/run.sh@34 -- # printf %02d 0 00:08:32.887 11:44:23 -- nvmf/run.sh@34 -- # port=4400 00:08:32.887 11:44:23 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:08:32.887 11:44:23 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:08:32.887 11:44:23 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:32.887 11:44:23 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:32.887 11:44:23 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:32.887 11:44:23 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0
00:08:32.887 [2024-04-18 11:44:23.248872] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:32.887 [2024-04-18 11:44:23.248967] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid365320 ]
00:08:32.887 EAL: No free 2048 kB hugepages reported on node 1
00:08:33.146 [2024-04-18 11:44:23.523854] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:33.146 [2024-04-18 11:44:23.676460] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:33.404 [2024-04-18 11:44:23.920721] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:33.404 [2024-04-18 11:44:23.936949] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 ***
00:08:33.404 INFO: Running with entropic power schedule (0xFF, 100).
00:08:33.404 INFO: Seed: 1161286556
00:08:33.662 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a),
00:08:33.662 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00),
00:08:33.662 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:08:33.663 INFO: A corpus is not provided, starting from an empty corpus
00:08:33.663 #2 INITED exec/s: 0 rss: 200Mb
00:08:33.663 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
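(That WARNING and the "This may also happen" line just below it are libFuzzer's standard empty-corpus notice; nothing is wrong at this point.) Before the coverage stream starts, note how the llvm_nvme_fuzz command line above was assembled: start_llvm_fuzz in nvmf/run.sh derives a dedicated TCP port from the fuzzer index so parallel runs never collide, rewrites the stock JSON config to listen there, and then launches the fuzz target. A sketch of that plumbing reconstructed from the trace (simplified; $rootdir stands in for the spdk checkout, and the -P output flag is left out):

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3

    # Fuzzer N listens on TCP port 44NN: index 0 -> 4400, index 1 -> 4401, ...
    local port
    port="44$(printf %02d "$fuzzer_type")"

    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    mkdir -p "$corpus_dir"

    # Point the target's listener at this run's port instead of the default 4420.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
}

With -t 1 each target gets roughly one second of fuzzing, which is why this short-fuzz job can cycle through all 25 nvmf fuzzers in a single pass. The run's own output continues below.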
00:08:33.663 This may also happen if the target rejected all inputs we tried so far 00:08:33.663 [2024-04-18 11:44:24.014607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:33.663 [2024-04-18 11:44:24.014656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.921 NEW_FUNC[1/669]: 0x549260 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:08:33.921 NEW_FUNC[2/669]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.921 [2024-04-18 11:44:24.375237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:33.921 [2024-04-18 11:44:24.375297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:33.921 #5 NEW cov: 11681 ft: 11654 corp: 2/128b lim: 320 exec/s: 0 rss: 216Mb L: 127/127 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:33.921 [2024-04-18 11:44:24.456242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:33.921 [2024-04-18 11:44:24.456285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.180 [2024-04-18 11:44:24.516726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.180 [2024-04-18 11:44:24.516768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.180 #7 NEW cov: 11705 ft: 11954 corp: 3/255b lim: 320 exec/s: 0 rss: 218Mb L: 127/127 MS: 1 CMP- DE: "[\251\357\002\317\375\004\000"- 00:08:34.180 [2024-04-18 11:44:24.590106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.180 [2024-04-18 11:44:24.590139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.180 [2024-04-18 11:44:24.640614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.180 [2024-04-18 11:44:24.640645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.180 #14 NEW cov: 11717 ft: 12073 corp: 4/382b lim: 320 exec/s: 0 rss: 220Mb L: 127/127 MS: 1 ChangeBit- 00:08:34.180 [2024-04-18 11:44:24.711803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (4a) qid:0 cid:4 nsid:4a4a4a4a cdw10:4a4a4a4a cdw11:4a4a4a4a SGL TRANSPORT DATA BLOCK TRANSPORT 0x4a4a4a4a4a4a4a4a 00:08:34.180 [2024-04-18 11:44:24.711838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.439 NEW_FUNC[1/2]: 0x1a77af0 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:08:34.439 NEW_FUNC[2/2]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.439 [2024-04-18 11:44:24.772047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN 
COMMAND (4a) qid:0 cid:4 nsid:4a4a4a4a cdw10:4a4a4a4a cdw11:4a4a4a4a SGL TRANSPORT DATA BLOCK TRANSPORT 0x4a4a4a4a4a4a4a4a 00:08:34.439 [2024-04-18 11:44:24.772078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.439 #16 NEW cov: 11844 ft: 12617 corp: 5/473b lim: 320 exec/s: 0 rss: 221Mb L: 91/127 MS: 1 InsertRepeatedBytes- 00:08:34.439 [2024-04-18 11:44:24.842627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.439 [2024-04-18 11:44:24.842658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.439 [2024-04-18 11:44:24.902858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.439 [2024-04-18 11:44:24.902888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.439 #18 NEW cov: 11844 ft: 12753 corp: 6/551b lim: 320 exec/s: 0 rss: 223Mb L: 78/127 MS: 1 EraseBytes- 00:08:34.439 [2024-04-18 11:44:24.967751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.439 [2024-04-18 11:44:24.967784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.698 [2024-04-18 11:44:25.018184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.018214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.698 #20 NEW cov: 11844 ft: 13017 corp: 7/678b lim: 320 exec/s: 20 rss: 224Mb L: 127/127 MS: 1 ShuffleBytes- 00:08:34.698 [2024-04-18 11:44:25.075529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.075563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.698 [2024-04-18 11:44:25.075643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0004fdcf cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.075660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.698 [2024-04-18 11:44:25.125929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.125958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.698 [2024-04-18 11:44:25.126037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0004fdcf cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.126055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.698 #22 NEW cov: 11844 ft: 13300 corp: 8/806b lim: 320 exec/s: 22 rss: 226Mb L: 128/128 MS: 1 InsertByte- 00:08:34.698 [2024-04-18 11:44:25.195595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.195626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.698 [2024-04-18 11:44:25.195700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.698 [2024-04-18 11:44:25.195719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.255844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.255873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.255972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.255990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.957 #24 NEW cov: 11844 ft: 13370 corp: 9/934b lim: 320 exec/s: 24 rss: 227Mb L: 128/128 MS: 1 InsertByte- 00:08:34.957 [2024-04-18 11:44:25.317012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.317049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.317133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:02efa95b cdw11:0004fdcf 00:08:34.957 [2024-04-18 11:44:25.317154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.377172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.377207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.377312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:02efa95b cdw11:0004fdcf 00:08:34.957 [2024-04-18 11:44:25.377329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:34.957 #26 NEW cov: 11844 ft: 13435 corp: 10/1089b lim: 320 exec/s: 26 rss: 228Mb L: 155/155 MS: 1 CopyPart- 00:08:34.957 [2024-04-18 11:44:25.450222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.450265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:34.957 [2024-04-18 11:44:25.500480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:34.957 [2024-04-18 11:44:25.500514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.216 #28 NEW cov: 11844 ft: 13519 corp: 11/1216b lim: 320 exec/s: 28 rss: 230Mb L: 127/155 MS: 1 ChangeByte- 
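For anyone skimming the fuzzer output, every '#N NEW' entry above is libFuzzer reporting that a mutated input reached new coverage and was kept. Taking the most recent one, '#28 NEW cov: 11844 ft: 13519 corp: 11/1216b lim: 320 exec/s: 28 rss: 230Mb L: 127/155 MS: 1 ChangeByte-', and decoding it per standard libFuzzer output conventions:

# #28              total inputs executed when this event fired
# NEW              the input was added to the in-memory corpus
# cov: 11844       distinct coverage points observed so far
# ft: 13519        distinct features (coverage refined by hit counts)
# corp: 11/1216b   corpus now holds 11 inputs totalling 1216 bytes
# lim: 320         current input-length cap, which grows as the run ages
# exec/s: 28       executions per second
# rss: 230Mb       resident memory of the fuzzing process
# L: 127/155       this input's size / largest input in the corpus
# MS: 1 ChangeByte-   the mutation sequence that produced it (one ChangeByte)

The 'DE:' suffix seen on some entries names the dictionary entry that the CMP and PersAutoDict mutators spliced in; the same bytes reappear in the run's closing 'Recommended dictionary' summary.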
00:08:35.216 [2024-04-18 11:44:25.565813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.217 [2024-04-18 11:44:25.565848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.217 [2024-04-18 11:44:25.565952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.217 [2024-04-18 11:44:25.565971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.217 [2024-04-18 11:44:25.616305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.217 [2024-04-18 11:44:25.616337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.217 [2024-04-18 11:44:25.616423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.217 [2024-04-18 11:44:25.616440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.217 #32 NEW cov: 11844 ft: 13625 corp: 12/1385b lim: 320 exec/s: 32 rss: 231Mb L: 169/169 MS: 3 EraseBytes-ChangeBinInt-CrossOver- 00:08:35.217 [2024-04-18 11:44:25.688993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (4a) qid:0 cid:4 nsid:5b4a4a4a cdw10:4a4a4a4a cdw11:4a4a4a4a SGL TRANSPORT DATA BLOCK TRANSPORT 0x4a4a4a4a4a4a4a4a 00:08:35.217 [2024-04-18 11:44:25.689030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.217 [2024-04-18 11:44:25.749194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (4a) qid:0 cid:4 nsid:5b4a4a4a cdw10:4a4a4a4a cdw11:4a4a4a4a SGL TRANSPORT DATA BLOCK TRANSPORT 0x4a4a4a4a4a4a4a4a 00:08:35.217 [2024-04-18 11:44:25.749226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.476 #34 NEW cov: 11844 ft: 13674 corp: 13/1484b lim: 320 exec/s: 34 rss: 232Mb L: 99/169 MS: 1 PersAutoDict- DE: "[\251\357\002\317\375\004\000"- 00:08:35.476 [2024-04-18 11:44:25.821077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.821110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.476 [2024-04-18 11:44:25.821192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.821210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.476 [2024-04-18 11:44:25.871255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.871284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.476 [2024-04-18 11:44:25.871362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.871379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:35.476 #36 NEW cov: 11844 ft: 13754 corp: 14/1628b lim: 320 exec/s: 36 rss: 233Mb L: 144/169 MS: 1 CrossOver- 00:08:35.476 [2024-04-18 11:44:25.935559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.935593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.476 [2024-04-18 11:44:25.985709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:35.476 [2024-04-18 11:44:25.985738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:35.476 #38 NEW cov: 11844 ft: 13802 corp: 15/1755b lim: 320 exec/s: 19 rss: 235Mb L: 127/169 MS: 1 ShuffleBytes- 00:08:35.476 #38 DONE cov: 11844 ft: 13802 corp: 15/1755b lim: 320 exec/s: 19 rss: 235Mb 00:08:35.476 ###### Recommended dictionary. ###### 00:08:35.476 "[\251\357\002\317\375\004\000" # Uses: 1 00:08:35.476 ###### End of recommended dictionary. ###### 00:08:35.476 Done 38 runs in 2 second(s) 00:08:36.046 11:44:26 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:08:36.046 11:44:26 -- ../common.sh@72 -- # (( i++ )) 00:08:36.046 11:44:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.046 11:44:26 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:36.046 11:44:26 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:08:36.046 11:44:26 -- nvmf/run.sh@24 -- # local timen=1 00:08:36.046 11:44:26 -- nvmf/run.sh@25 -- # local core=0x1 00:08:36.046 11:44:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:36.046 11:44:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:08:36.046 11:44:26 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:36.046 11:44:26 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:36.046 11:44:26 -- nvmf/run.sh@34 -- # printf %02d 1 00:08:36.046 11:44:26 -- nvmf/run.sh@34 -- # port=4401 00:08:36.046 11:44:26 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:36.046 11:44:26 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:08:36.046 11:44:26 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:36.046 11:44:26 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:36.046 11:44:26 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:36.046 11:44:26 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:08:36.046 [2024-04-18 11:44:26.496312] Starting SPDK v24.05-pre git sha1 65b4e17c6 / 
DPDK 23.11.0 initialization... 00:08:36.046 [2024-04-18 11:44:26.496406] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid365723 ] 00:08:36.046 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.305 [2024-04-18 11:44:26.768753] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.564 [2024-04-18 11:44:26.921584] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.823 [2024-04-18 11:44:27.180470] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:36.823 [2024-04-18 11:44:27.196705] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:08:36.823 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.823 INFO: Seed: 125024493 00:08:36.824 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:36.824 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:36.824 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:08:36.824 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.824 #2 INITED exec/s: 0 rss: 199Mb 00:08:36.824 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.824 This may also happen if the target rejected all inputs we tried so far 00:08:36.824 [2024-04-18 11:44:27.274139] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:36.824 [2024-04-18 11:44:27.274404] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:36.824 [2024-04-18 11:44:27.274685] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:36.824 [2024-04-18 11:44:27.275187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.824 [2024-04-18 11:44:27.275240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:36.824 [2024-04-18 11:44:27.275339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.824 [2024-04-18 11:44:27.275361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:36.824 [2024-04-18 11:44:27.275451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:36.824 [2024-04-18 11:44:27.275471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.083 NEW_FUNC[1/671]: 0x549cc0 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:08:37.083 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.341 [2024-04-18 11:44:27.634980] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.341 [2024-04-18 11:44:27.635228] ctrlr.c:2562:nvmf_ctrlr_get_log_page: 
*ERROR*: Invalid log page offset 0x30000ffff 00:08:37.341 [2024-04-18 11:44:27.635457] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.635822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.635885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.636000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.636033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.636147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.636177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.342 #6 NEW cov: 11781 ft: 11754 corp: 2/24b lim: 30 exec/s: 0 rss: 217Mb L: 23/23 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:37.342 [2024-04-18 11:44:27.705214] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.705481] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.705709] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (57796) > len (1044) 00:08:37.342 [2024-04-18 11:44:27.706177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.706215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.706300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.706318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.706407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:010400fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.706430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.765427] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.765682] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.765926] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (57796) > len (1044) 00:08:37.342 [2024-04-18 11:44:27.766360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.766399] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.766493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.766511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.766607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:010400fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.766624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.342 #8 NEW cov: 11833 ft: 12230 corp: 3/47b lim: 30 exec/s: 0 rss: 218Mb L: 23/23 MS: 1 CMP- DE: "\001\004\375\320\341\304\372\254"- 00:08:37.342 [2024-04-18 11:44:27.837555] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.837814] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:37.342 [2024-04-18 11:44:27.838050] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.838284] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.342 [2024-04-18 11:44:27.838769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.838803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.838887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.838905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.838995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.839012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.342 [2024-04-18 11:44:27.839094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.342 [2024-04-18 11:44:27.839118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.898191] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.601 [2024-04-18 11:44:27.898479] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:37.601 [2024-04-18 11:44:27.898716] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.601 [2024-04-18 11:44:27.898984] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.601 [2024-04-18 11:44:27.899452] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.899486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.899577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.899595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.899678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.899696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.899783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.899802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.601 #10 NEW cov: 11845 ft: 13110 corp: 4/71b lim: 30 exec/s: 0 rss: 220Mb L: 24/24 MS: 1 InsertByte- 00:08:37.601 [2024-04-18 11:44:27.969708] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.601 [2024-04-18 11:44:27.969963] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:37.601 [2024-04-18 11:44:27.970218] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.601 [2024-04-18 11:44:27.970699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.970737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.970829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.970851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.601 [2024-04-18 11:44:27.970943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.601 [2024-04-18 11:44:27.970962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.019888] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.602 [2024-04-18 11:44:28.020141] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:37.602 [2024-04-18 11:44:28.020392] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.602 [2024-04-18 11:44:28.020885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.020919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.021009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.021028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.021112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.021131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.602 #12 NEW cov: 11931 ft: 13425 corp: 5/94b lim: 30 exec/s: 0 rss: 222Mb L: 23/24 MS: 1 ChangeByte- 00:08:37.602 [2024-04-18 11:44:28.091695] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (244740) > buf size (4096) 00:08:37.602 [2024-04-18 11:44:28.091951] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.602 [2024-04-18 11:44:28.092214] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:37.602 [2024-04-18 11:44:28.092442] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.602 [2024-04-18 11:44:28.092911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.092948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.093042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.093061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.093151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.093171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.093256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.602 [2024-04-18 11:44:28.093277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.602 [2024-04-18 11:44:28.151959] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (244740) > buf size (4096) 00:08:37.602 [2024-04-18 11:44:28.152199] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.602 [2024-04-18 11:44:28.152474] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:37.861 [2024-04-18 11:44:28.152709] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page 
offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.153149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.153188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.153289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.153314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.153427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.153448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.153536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.153555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.861 #14 NEW cov: 11939 ft: 13627 corp: 6/123b lim: 30 exec/s: 0 rss: 223Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:37.861 [2024-04-18 11:44:28.220395] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.220664] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:37.861 [2024-04-18 11:44:28.220909] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.221165] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.221609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.221647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.221735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.221758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.221854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.221872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.221960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.221977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.280496] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.280757] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:37.861 [2024-04-18 11:44:28.280987] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.281230] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.281698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.281733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.281829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.281847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.281944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.281963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.861 [2024-04-18 11:44:28.282046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.861 [2024-04-18 11:44:28.282064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:37.861 #16 NEW cov: 11939 ft: 13682 corp: 7/147b lim: 30 exec/s: 16 rss: 225Mb L: 24/29 MS: 1 ShuffleBytes- 00:08:37.861 [2024-04-18 11:44:28.343916] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.861 [2024-04-18 11:44:28.344183] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.862 [2024-04-18 11:44:28.344458] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.862 [2024-04-18 11:44:28.344926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.344964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.862 [2024-04-18 11:44:28.345056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.345075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.862 [2024-04-18 11:44:28.345165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.345185] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:37.862 [2024-04-18 11:44:28.394150] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.862 [2024-04-18 11:44:28.394428] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.862 [2024-04-18 11:44:28.394685] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:37.862 [2024-04-18 11:44:28.395130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.395168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:37.862 [2024-04-18 11:44:28.395259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.395287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:37.862 [2024-04-18 11:44:28.395373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:37.862 [2024-04-18 11:44:28.395394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.121 #18 NEW cov: 11939 ft: 13757 corp: 8/170b lim: 30 exec/s: 18 rss: 226Mb L: 23/29 MS: 1 CopyPart- 00:08:38.121 [2024-04-18 11:44:28.466642] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.466917] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:38.121 [2024-04-18 11:44:28.467164] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.467658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.467704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.467800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.467827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.467925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.467952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.516972] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.517235] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000d7ff 00:08:38.121 [2024-04-18 11:44:28.517494] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page 
offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.517957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.517993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.518084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.518106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.518197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.518219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.121 #20 NEW cov: 11939 ft: 13816 corp: 9/193b lim: 30 exec/s: 20 rss: 227Mb L: 23/29 MS: 1 ShuffleBytes- 00:08:38.121 [2024-04-18 11:44:28.583998] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.584257] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.121 [2024-04-18 11:44:28.584510] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000004fd 00:08:38.121 [2024-04-18 11:44:28.584754] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000acff 00:08:38.121 [2024-04-18 11:44:28.585220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.585266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.585353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.585375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.585469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1cff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.585495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.585588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0e102c4 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.585608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.644253] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.121 [2024-04-18 11:44:28.644522] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.121 [2024-04-18 11:44:28.644762] 
ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000004fd 00:08:38.121 [2024-04-18 11:44:28.645024] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000acff 00:08:38.121 [2024-04-18 11:44:28.645501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.645539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.645627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.645648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.645732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:1cff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.645751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.121 [2024-04-18 11:44:28.645844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:d0e102c4 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.121 [2024-04-18 11:44:28.645863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.380 #22 NEW cov: 11939 ft: 13984 corp: 10/219b lim: 30 exec/s: 22 rss: 229Mb L: 26/29 MS: 1 InsertRepeatedBytes- 00:08:38.380 [2024-04-18 11:44:28.718604] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.718857] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:38.380 [2024-04-18 11:44:28.719100] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.719364] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.380 [2024-04-18 11:44:28.719821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.719866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.719961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.719991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.720084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.720109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.720201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.720226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.768736] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.769010] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cdff 00:08:38.380 [2024-04-18 11:44:28.769263] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.769515] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.380 [2024-04-18 11:44:28.769984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.770019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.770105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.770124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.770215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.770235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.770321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.770342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.380 #24 NEW cov: 11939 ft: 14047 corp: 11/244b lim: 30 exec/s: 24 rss: 230Mb L: 25/29 MS: 1 InsertByte- 00:08:38.380 [2024-04-18 11:44:28.842157] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.842433] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.842682] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (57796) > len (1044) 00:08:38.380 [2024-04-18 11:44:28.843132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef0783ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.843166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.843255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.843275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.843357] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:010400fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.843378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.892478] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.892725] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.380 [2024-04-18 11:44:28.892975] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (57796) > len (1044) 00:08:38.380 [2024-04-18 11:44:28.893436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef0783ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.893470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.893553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.893573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.380 [2024-04-18 11:44:28.893662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:010400fd cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.380 [2024-04-18 11:44:28.893682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.380 #26 NEW cov: 11939 ft: 14111 corp: 12/267b lim: 30 exec/s: 26 rss: 231Mb L: 23/29 MS: 1 ChangeBinInt- 00:08:38.640 [2024-04-18 11:44:28.954220] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:28.954483] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:28.954722] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.640 [2024-04-18 11:44:28.955171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:28.955206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:28.955301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:28.955323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:28.955420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:28.955439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.640 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 
00:08:38.640 [2024-04-18 11:44:29.004666] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.004945] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.005205] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:08:38.640 [2024-04-18 11:44:29.005664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.005697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.005795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.005816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.005907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.005926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.640 #28 NEW cov: 11956 ft: 14175 corp: 13/290b lim: 30 exec/s: 28 rss: 232Mb L: 23/29 MS: 1 CopyPart- 00:08:38.640 [2024-04-18 11:44:29.078864] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff21 00:08:38.640 [2024-04-18 11:44:29.079145] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.079396] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d0e1 00:08:38.640 [2024-04-18 11:44:29.079661] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.080116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef0783ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.080152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.080242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.080263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.080355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff018104 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.080374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.080468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c4fa83ac cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.080490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.139105] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff21 00:08:38.640 [2024-04-18 11:44:29.139351] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.139618] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000d0e1 00:08:38.640 [2024-04-18 11:44:29.139866] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:08:38.640 [2024-04-18 11:44:29.140298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ef0783ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.140331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.140419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.140450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.140543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff018104 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.140563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:38.640 [2024-04-18 11:44:29.140647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:c4fa83ac cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.640 [2024-04-18 11:44:29.140666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:38.640 #30 NEW cov: 11956 ft: 14257 corp: 14/314b lim: 30 exec/s: 30 rss: 234Mb L: 24/29 MS: 1 InsertByte- 00:08:38.900 [2024-04-18 11:44:29.200610] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000efff 00:08:38.900 [2024-04-18 11:44:29.201070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.900 [2024-04-18 11:44:29.201108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.900 [2024-04-18 11:44:29.250690] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000efff 00:08:38.900 [2024-04-18 11:44:29.251146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:efff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:38.900 [2024-04-18 11:44:29.251181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:38.900 #32 NEW cov: 11956 ft: 14675 corp: 15/322b lim: 30 exec/s: 16 rss: 235Mb L: 8/29 MS: 1 CrossOver- 00:08:38.900 #32 DONE cov: 11956 ft: 14675 corp: 15/322b lim: 30 exec/s: 16 rss: 235Mb 00:08:38.900 ###### Recommended dictionary. ###### 00:08:38.900 "\001\004\375\320\341\304\372\254" # Uses: 0 00:08:38.900 ###### End of recommended dictionary. 
###### 00:08:38.900 Done 32 runs in 2 second(s) 00:08:39.157 11:44:29 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:08:39.158 11:44:29 -- ../common.sh@72 -- # (( i++ )) 00:08:39.158 11:44:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.158 11:44:29 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:39.158 11:44:29 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:08:39.158 11:44:29 -- nvmf/run.sh@24 -- # local timen=1 00:08:39.158 11:44:29 -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.158 11:44:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:39.158 11:44:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:08:39.158 11:44:29 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:39.158 11:44:29 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:39.158 11:44:29 -- nvmf/run.sh@34 -- # printf %02d 2 00:08:39.158 11:44:29 -- nvmf/run.sh@34 -- # port=4402 00:08:39.158 11:44:29 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:39.416 11:44:29 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:08:39.416 11:44:29 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.416 11:44:29 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.416 11:44:29 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:39.416 11:44:29 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:08:39.416 [2024-04-18 11:44:29.762024] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:39.416 [2024-04-18 11:44:29.762134] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid366157 ] 00:08:39.416 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.675 [2024-04-18 11:44:30.036193] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.675 [2024-04-18 11:44:30.193272] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.933 [2024-04-18 11:44:30.442179] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.933 [2024-04-18 11:44:30.458422] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:08:39.933 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:39.933 INFO: Seed: 3386058799 00:08:40.192 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:40.192 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:40.192 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:08:40.192 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.192 #2 INITED exec/s: 0 rss: 200Mb 00:08:40.192 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.192 This may also happen if the target rejected all inputs we tried so far 00:08:40.192 [2024-04-18 11:44:30.514626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.192 [2024-04-18 11:44:30.514677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.192 [2024-04-18 11:44:30.514724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.192 [2024-04-18 11:44:30.514741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.192 [2024-04-18 11:44:30.514790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.192 [2024-04-18 11:44:30.514807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.192 [2024-04-18 11:44:30.514860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.192 [2024-04-18 11:44:30.514876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.450 NEW_FUNC[1/670]: 0x54cc90 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:08:40.450 NEW_FUNC[2/670]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.450 [2024-04-18 11:44:30.875618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.875676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.875749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.875771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.875840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.875860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 
11:44:30.875922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.875943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.450 #11 NEW cov: 11736 ft: 11710 corp: 2/31b lim: 35 exec/s: 0 rss: 217Mb L: 30/30 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:40.450 [2024-04-18 11:44:30.929857] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:40.450 [2024-04-18 11:44:30.930106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.930141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.930202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.930220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.930274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.930290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.930341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.930360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.979874] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:40.450 [2024-04-18 11:44:30.980115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.980146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.980205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.980223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.980276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 11:44:30.980293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.450 [2024-04-18 11:44:30.980354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.450 [2024-04-18 
11:44:30.980374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.716 #13 NEW cov: 11770 ft: 12202 corp: 3/61b lim: 35 exec/s: 0 rss: 218Mb L: 30/30 MS: 1 ChangeBinInt- 00:08:40.716 [2024-04-18 11:44:31.026823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.026857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.026919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.026937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.026993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.027010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.027064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.027081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.066952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.066984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.067045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.716 [2024-04-18 11:44:31.067063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.716 [2024-04-18 11:44:31.067119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.067135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.067189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.067205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.717 #15 NEW cov: 11782 ft: 12593 corp: 4/91b lim: 35 exec/s: 0 rss: 219Mb L: 30/30 MS: 1 ShuffleBytes- 00:08:40.717 [2024-04-18 11:44:31.113605] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:40.717 [2024-04-18 11:44:31.113859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.113889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.113943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.113960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.114018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.114034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.114088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.114109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.163686] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:40.717 [2024-04-18 11:44:31.163921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.163951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.164010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.164026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.164086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.164104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.164162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.164180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.717 #17 NEW cov: 11868 ft: 12789 corp: 5/121b lim: 35 exec/s: 0 rss: 222Mb L: 30/30 MS: 1 ChangeByte- 00:08:40.717 [2024-04-18 11:44:31.222318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.222358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.222421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.222441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.222496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.222513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.717 [2024-04-18 11:44:31.222572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:6d006e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.717 [2024-04-18 11:44:31.222589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.977 [2024-04-18 11:44:31.272478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.977 [2024-04-18 11:44:31.272511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.977 [2024-04-18 11:44:31.272570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.272587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.272646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.272663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.272719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:6d006e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.272736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.978 #19 NEW cov: 11868 ft: 12900 corp: 6/151b lim: 35 exec/s: 0 rss: 223Mb L: 30/30 MS: 1 CMP- DE: "nvmf"- 00:08:40.978 [2024-04-18 11:44:31.319582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.319615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.319677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.319694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.319748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 
11:44:31.319767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.319821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.319841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.359671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.359703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.359765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.359783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.359838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.359854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.359909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.359925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:40.978 #21 NEW cov: 11868 ft: 12984 corp: 7/181b lim: 35 exec/s: 0 rss: 225Mb L: 30/30 MS: 1 CopyPart- 00:08:40.978 [2024-04-18 11:44:31.406401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.406439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.406495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.406512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.446503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.446534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.446609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.446627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:40.978 #26 NEW cov: 11868 ft: 13591 corp: 8/200b lim: 35 exec/s: 0 rss: 226Mb L: 19/30 MS: 4 ShuffleBytes-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:40.978 [2024-04-18 11:44:31.505163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.505196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.505257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.505277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.505333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.505350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:40.978 [2024-04-18 11:44:31.505407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:40.978 [2024-04-18 11:44:31.505428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.545286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.545319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.545382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.545400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.545459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.545476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.545532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.545548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.238 #28 NEW cov: 11868 ft: 13638 corp: 9/230b lim: 35 exec/s: 28 rss: 227Mb L: 30/30 MS: 1 CopyPart- 00:08:41.238 [2024-04-18 11:44:31.592170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.592203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.592262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.238 [2024-04-18 11:44:31.592279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.238 [2024-04-18 11:44:31.592336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.592353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.632275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.632306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.632368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.632385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.632446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.632466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.239 #30 NEW cov: 11868 ft: 13871 corp: 10/253b lim: 35 exec/s: 30 rss: 228Mb L: 23/30 MS: 1 EraseBytes- 00:08:41.239 [2024-04-18 11:44:31.678861] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.239 [2024-04-18 11:44:31.679101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.679132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.679192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.679210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.679270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00003fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.679287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.679343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.679362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.729003] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.239 [2024-04-18 11:44:31.729242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.729274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.729334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.729351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.729417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00003fff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.729450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.729507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:07ff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.729526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.239 #32 NEW cov: 11868 ft: 14050 corp: 11/283b lim: 35 exec/s: 32 rss: 230Mb L: 30/30 MS: 1 ChangeByte- 00:08:41.239 [2024-04-18 11:44:31.776209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.776242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.776308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.776325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.776390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.776407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.776470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.776487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.239 [2024-04-18 11:44:31.776541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.239 [2024-04-18 11:44:31.776558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.816325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.816359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.816427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.816446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.816504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.816520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.816578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.816595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.816650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.816666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.500 #34 NEW cov: 11868 ft: 14173 corp: 12/318b lim: 35 exec/s: 34 rss: 231Mb L: 35/35 MS: 1 CopyPart- 00:08:41.500 [2024-04-18 11:44:31.864351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.864385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.864466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.864485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.864543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.864561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.864619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.864636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.904496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.904528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.904601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.904617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.904673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.904689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.904744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.904760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.500 #36 NEW cov: 11868 ft: 14233 corp: 13/348b lim: 35 exec/s: 36 rss: 232Mb L: 30/35 MS: 1 ChangeByte- 00:08:41.500 [2024-04-18 11:44:31.951812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.951846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.951907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.951924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.951980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.951996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:31.952054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:31.952071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:32.001897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:32.001929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:32.001985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 
11:44:32.002002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:32.002056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:32.002073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.500 [2024-04-18 11:44:32.002127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.500 [2024-04-18 11:44:32.002165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.500 #38 NEW cov: 11868 ft: 14256 corp: 14/378b lim: 35 exec/s: 38 rss: 234Mb L: 30/35 MS: 1 ShuffleBytes- 00:08:41.761 [2024-04-18 11:44:32.051037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.051070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.051128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.051146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.051203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.051220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.051274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.051291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.091087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.091118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.091174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.091191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.091246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.091262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:08:41.761 [2024-04-18 11:44:32.091316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.091332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.761 #45 NEW cov: 11868 ft: 14264 corp: 15/411b lim: 35 exec/s: 45 rss: 235Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:08:41.761 [2024-04-18 11:44:32.138125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.138158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.138217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.138234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.138287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.138304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.138364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff006e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.138380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.138442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:6e7600ff cdw11:ff006d66 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.138459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.188226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.188257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.188315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.188332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.188386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.188403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.188467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:7 nsid:0 cdw10:ffff00ff cdw11:ff006e76 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.188484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.188539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:6e7600ff cdw11:ff006d66 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.188555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:41.761 #47 NEW cov: 11868 ft: 14316 corp: 16/446b lim: 35 exec/s: 47 rss: 236Mb L: 35/35 MS: 1 CopyPart- 00:08:41.761 [2024-04-18 11:44:32.236293] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.761 [2024-04-18 11:44:32.236558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.236591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.236651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.236696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.236753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.236772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:41.761 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.761 [2024-04-18 11:44:32.276483] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:08:41.761 [2024-04-18 11:44:32.276732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.276765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.276823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff007e cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.276840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:41.761 [2024-04-18 11:44:32.276894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:41.761 [2024-04-18 11:44:32.276913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.022 #49 NEW cov: 11885 ft: 14412 corp: 17/471b lim: 35 exec/s: 49 rss: 237Mb L: 25/35 MS: 1 EraseBytes- 00:08:42.022 [2024-04-18 11:44:32.336774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00f5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.336808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.336866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.336883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.336941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.336957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.337012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.337029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.386918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:ff00f5ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.386952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.387011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.387030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.387088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.387105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.387158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.387175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.022 #51 NEW cov: 11885 ft: 14423 corp: 18/501b lim: 35 exec/s: 51 rss: 239Mb L: 30/35 MS: 1 ChangeBinInt- 00:08:42.022 [2024-04-18 11:44:32.433330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.433363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.433430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 
11:44:32.433447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.433504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.433521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.433582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.433599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.483467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff008a cdw11:2d00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.483499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.483557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.483574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.483627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.483644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:42.022 [2024-04-18 11:44:32.483703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:42.022 [2024-04-18 11:44:32.483720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:42.022 #53 NEW cov: 11885 ft: 14439 corp: 19/530b lim: 35 exec/s: 26 rss: 240Mb L: 29/35 MS: 1 EraseBytes- 00:08:42.022 #53 DONE cov: 11885 ft: 14439 corp: 19/530b lim: 35 exec/s: 26 rss: 240Mb 00:08:42.022 ###### Recommended dictionary. ###### 00:08:42.022 "nvmf" # Uses: 0 00:08:42.022 ###### End of recommended dictionary. 
###### 00:08:42.022 Done 53 runs in 2 second(s) 00:08:42.592 11:44:32 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:08:42.592 11:44:32 -- ../common.sh@72 -- # (( i++ )) 00:08:42.592 11:44:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.592 11:44:32 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:42.592 11:44:32 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:08:42.592 11:44:32 -- nvmf/run.sh@24 -- # local timen=1 00:08:42.592 11:44:32 -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.592 11:44:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:42.592 11:44:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:08:42.592 11:44:32 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:42.592 11:44:32 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:42.592 11:44:32 -- nvmf/run.sh@34 -- # printf %02d 3 00:08:42.592 11:44:32 -- nvmf/run.sh@34 -- # port=4403 00:08:42.592 11:44:32 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:42.592 11:44:32 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:08:42.592 11:44:32 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.592 11:44:32 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:42.592 11:44:32 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:42.592 11:44:32 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:08:42.592 [2024-04-18 11:44:32.992266] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:42.592 [2024-04-18 11:44:32.992363] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid366651 ] 00:08:42.592 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.851 [2024-04-18 11:44:33.258332] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.110 [2024-04-18 11:44:33.411176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.110 [2024-04-18 11:44:33.655953] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:43.369 [2024-04-18 11:44:33.672175] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:08:43.369 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:43.369 INFO: Seed: 2304071534 00:08:43.369 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:43.369 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:43.369 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:08:43.369 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.369 #2 INITED exec/s: 0 rss: 200Mb 00:08:43.369 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.369 This may also happen if the target rejected all inputs we tried so far 00:08:43.628 NEW_FUNC[1/659]: 0x54ecd0 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:08:43.628 NEW_FUNC[2/659]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.887 #7 NEW cov: 11644 ft: 11610 corp: 2/19b lim: 20 exec/s: 0 rss: 216Mb L: 18/18 MS: 4 CrossOver-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:43.887 #11 NEW cov: 11684 ft: 12446 corp: 3/23b lim: 20 exec/s: 0 rss: 219Mb L: 4/18 MS: 3 CopyPart-CrossOver-InsertByte- 00:08:43.887 #13 NEW cov: 11696 ft: 12742 corp: 4/41b lim: 20 exec/s: 0 rss: 220Mb L: 18/18 MS: 1 ChangeByte- 00:08:44.146 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.146 #17 NEW cov: 11804 ft: 13271 corp: 5/51b lim: 20 exec/s: 0 rss: 222Mb L: 10/18 MS: 3 InsertByte-ChangeBit-CMP- DE: "\000\004\375\324W\262O\320"- 00:08:44.146 #19 NEW cov: 11804 ft: 13463 corp: 6/69b lim: 20 exec/s: 0 rss: 223Mb L: 18/18 MS: 1 ShuffleBytes- 00:08:44.406 #21 NEW cov: 11804 ft: 13545 corp: 7/87b lim: 20 exec/s: 21 rss: 224Mb L: 18/18 MS: 1 CopyPart- 00:08:44.406 #23 NEW cov: 11804 ft: 13580 corp: 8/105b lim: 20 exec/s: 23 rss: 226Mb L: 18/18 MS: 1 ShuffleBytes- 00:08:44.665 #30 NEW cov: 11808 ft: 13706 corp: 9/120b lim: 20 exec/s: 30 rss: 227Mb L: 15/18 MS: 1 CrossOver- 00:08:44.665 #32 NEW cov: 11808 ft: 13747 corp: 10/138b lim: 20 exec/s: 32 rss: 228Mb L: 18/18 MS: 1 ChangeBinInt- 00:08:44.924 #34 NEW cov: 11808 ft: 13943 corp: 11/156b lim: 20 exec/s: 34 rss: 230Mb L: 18/18 MS: 1 CrossOver- 00:08:44.924 #36 NEW cov: 11808 ft: 14113 corp: 12/176b lim: 20 exec/s: 36 rss: 231Mb L: 20/20 MS: 1 CopyPart- 00:08:45.183 #38 NEW cov: 11808 ft: 14123 corp: 13/180b lim: 20 exec/s: 38 rss: 233Mb L: 4/20 MS: 1 ShuffleBytes- 00:08:45.183 #40 NEW cov: 11808 ft: 14234 corp: 14/198b lim: 20 exec/s: 40 rss: 234Mb L: 18/20 MS: 1 CopyPart- 00:08:45.443 #42 NEW cov: 11808 ft: 14311 corp: 15/216b lim: 20 exec/s: 21 rss: 236Mb L: 18/20 MS: 1 ChangeBit- 00:08:45.443 #42 DONE cov: 11808 ft: 14311 corp: 15/216b lim: 20 exec/s: 21 rss: 236Mb 00:08:45.443 ###### Recommended dictionary. ###### 00:08:45.443 "\000\004\375\324W\262O\320" # Uses: 0 00:08:45.443 ###### End of recommended dictionary. 
###### 00:08:45.443 Done 42 runs in 2 second(s) 00:08:45.703 11:44:36 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:45.703 11:44:36 -- ../common.sh@72 -- # (( i++ )) 00:08:45.703 11:44:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.703 11:44:36 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:45.703 11:44:36 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:45.703 11:44:36 -- nvmf/run.sh@24 -- # local timen=1 00:08:45.703 11:44:36 -- nvmf/run.sh@25 -- # local core=0x1 00:08:45.703 11:44:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:45.703 11:44:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:45.703 11:44:36 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:45.703 11:44:36 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:45.703 11:44:36 -- nvmf/run.sh@34 -- # printf %02d 4 00:08:45.703 11:44:36 -- nvmf/run.sh@34 -- # port=4404 00:08:45.703 11:44:36 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:45.703 11:44:36 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:45.703 11:44:36 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:45.703 11:44:36 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.703 11:44:36 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:45.703 11:44:36 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:45.964 [2024-04-18 11:44:36.279835] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:45.964 [2024-04-18 11:44:36.279928] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367034 ] 00:08:45.964 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.224 [2024-04-18 11:44:36.566666] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.224 [2024-04-18 11:44:36.722446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.484 [2024-04-18 11:44:36.968159] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:46.484 [2024-04-18 11:44:36.984377] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:46.484 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:46.484 INFO: Seed: 1322092006 00:08:46.484 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:46.484 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:46.484 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:46.484 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.484 #2 INITED exec/s: 0 rss: 200Mb 00:08:46.484 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:46.484 This may also happen if the target rejected all inputs we tried so far 00:08:46.744 [2024-04-18 11:44:37.040397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.744 [2024-04-18 11:44:37.040442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:46.744 [2024-04-18 11:44:37.040513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:46.744 [2024-04-18 11:44:37.040530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.004 NEW_FUNC[1/671]: 0x54ffa0 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:47.004 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:47.004 [2024-04-18 11:44:37.401301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.401356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.401426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.401444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.004 #10 NEW cov: 11758 ft: 11731 corp: 2/15b lim: 35 exec/s: 0 rss: 217Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:08:47.004 [2024-04-18 11:44:37.467827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.467872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.467941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.467959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.468011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 
11:44:37.468027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.517938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.517978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.518054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.518071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.004 [2024-04-18 11:44:37.518125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.004 [2024-04-18 11:44:37.518141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.004 #12 NEW cov: 11782 ft: 12535 corp: 3/38b lim: 35 exec/s: 0 rss: 218Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:47.265 [2024-04-18 11:44:37.572587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.572630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.572685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.572703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.612653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.612687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.612744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.612761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.265 #14 NEW cov: 11794 ft: 12767 corp: 4/52b lim: 35 exec/s: 0 rss: 220Mb L: 14/23 MS: 1 ChangeBit- 00:08:47.265 [2024-04-18 11:44:37.671411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100c2c2 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.671451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.671507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 
[2024-04-18 11:44:37.671524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.671596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.671613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.721591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100c2c2 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.721624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.721678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.721696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.721747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.721763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.265 #16 NEW cov: 11880 ft: 13019 corp: 5/75b lim: 35 exec/s: 0 rss: 222Mb L: 23/23 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:47.265 [2024-04-18 11:44:37.773746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.773781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.265 [2024-04-18 11:44:37.773854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.265 [2024-04-18 11:44:37.773872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.525 [2024-04-18 11:44:37.823859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.525 [2024-04-18 11:44:37.823894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.525 [2024-04-18 11:44:37.823951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.525 [2024-04-18 11:44:37.823968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.525 #18 NEW cov: 11880 ft: 13063 corp: 6/89b lim: 35 exec/s: 0 rss: 223Mb L: 14/23 MS: 1 CopyPart- 00:08:47.526 [2024-04-18 11:44:37.881115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
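The paired NOTICE records above come from SPDK's qpair pretty-printers: nvme_admin_qpair_print_command (nvme_qpair.c:225) logs each fuzzed admin command before it is sent, and spdk_nvme_print_completion (nvme_qpair.c:477) logs the target's reply. Every CREATE IO CQ attempt comes back INVALID OPCODE (00/01), consistent with a fabrics controller, which sets up queues via Connect rather than these admin opcodes. As a minimal sketch (assuming only the public NVMe base-spec layout — the struct below is a local simplification, not SPDK's struct spdk_nvme_cmd), this is how the cdw10/cdw11 values seen throughout this run pack together:

```c
/*
 * Minimal sketch (not SPDK code): pack CDW10/CDW11 of an NVMe
 * "Create I/O Completion Queue" admin command (opcode 0x05) the way
 * the fuzzed commands above are laid out.  Field positions follow the
 * NVMe base specification.
 */
#include <stdint.h>
#include <stdio.h>

struct nvme_admin_cmd {
    uint8_t  opc;    /* opcode: 0x05 = CREATE IO CQ                    */
    uint16_t cid;    /* command identifier (cid:4, cid:5, ... above)   */
    uint32_t nsid;   /* namespace id (0 for queue management)          */
    uint32_t cdw10;  /* QSIZE[31:16] | QID[15:0]                       */
    uint32_t cdw11;  /* IV[31:16] | IEN[1] | PC[0]                     */
};

static struct nvme_admin_cmd make_create_cq(uint16_t qid, uint16_t qsize,
                                            uint16_t iv, int ien, int pc)
{
    struct nvme_admin_cmd cmd = { .opc = 0x05 };

    cmd.cdw10 = ((uint32_t)qsize << 16) | qid;
    cmd.cdw11 = ((uint32_t)iv << 16) | (uint32_t)((ien != 0) << 1) | (pc != 0);
    return cmd;
}

int main(void)
{
    /* Reproduce the repeated pattern above: cdw10:c2c2c2c2 cdw11:c2c20003 */
    struct nvme_admin_cmd cmd = make_create_cq(0xc2c2, 0xc2c2, 0xc2c2, 1, 1);

    printf("CREATE IO CQ (%02x) cdw10:%08x cdw11:%08x\n",
           cmd.opc, cmd.cdw10, cmd.cdw11);
    return 0;
}
```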
00:08:47.526 [2024-04-18 11:44:37.881150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:37.881224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.881243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:37.921216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.921248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:37.921321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.921339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.526 #20 NEW cov: 11880 ft: 13178 corp: 7/103b lim: 35 exec/s: 0 rss: 225Mb L: 14/23 MS: 1 CopyPart- 00:08:47.526 [2024-04-18 11:44:37.980199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.980234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:37.980296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.980318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:37.980391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13271313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:37.980408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:38.020331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:38.020364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:38.020444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:38.020462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.526 [2024-04-18 11:44:38.020516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13271313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.526 [2024-04-18 11:44:38.020549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.526 #22 NEW cov: 11880 ft: 13276 corp: 8/126b lim: 35 exec/s: 22 rss: 226Mb L: 23/23 MS: 1 ChangeByte- 00:08:47.786 [2024-04-18 11:44:38.080353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.080390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.080473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.080492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.080547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.080563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.080616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.080632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.120462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.120500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.120556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.120572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.120627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.120643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.120710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.120727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.786 #29 NEW cov: 11880 ft: 13690 corp: 9/154b lim: 35 exec/s: 29 rss: 227Mb L: 28/28 MS: 1 CopyPart- 00:08:47.786 [2024-04-18 11:44:38.181845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6363c263 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.181879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.181934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.181951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.182008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:63c26363 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.182024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.182076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.182092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.221927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:6363c263 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.221958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.222015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63630002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.222032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.222091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:63c26363 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.222108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.786 [2024-04-18 11:44:38.222160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.786 [2024-04-18 11:44:38.222176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.786 #36 NEW cov: 11880 ft: 13754 corp: 10/184b lim: 35 exec/s: 36 rss: 228Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:47.786 [2024-04-18 11:44:38.281064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.281099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.281156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c3 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.281174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.281232] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.281249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.281306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c20ac2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.281323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.331153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.331185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.331244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c3 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.331277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.331334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.331350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:47.787 [2024-04-18 11:44:38.331404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c20ac2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:47.787 [2024-04-18 11:44:38.331427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.047 #38 NEW cov: 11880 ft: 13810 corp: 11/212b lim: 35 exec/s: 38 rss: 230Mb L: 28/30 MS: 1 CopyPart- 00:08:48.047 [2024-04-18 11:44:38.393518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.393552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.393606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c3 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.393623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.393680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.393695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.393746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:13131313 
cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.393762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.443644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.443676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.443731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c3 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.443747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.443805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.443822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.443876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.443891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.047 #40 NEW cov: 11880 ft: 13858 corp: 12/240b lim: 35 exec/s: 40 rss: 231Mb L: 28/30 MS: 1 ChangeBit- 00:08:48.047 [2024-04-18 11:44:38.492995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c3 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.493030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.493087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.493104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.493158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.493174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.493227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c20ac2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.493245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.543077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c3 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.543110] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.543165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.543181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.543255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.543272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.047 [2024-04-18 11:44:38.543330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c2c20ac2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.047 [2024-04-18 11:44:38.543346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.047 #42 NEW cov: 11880 ft: 13874 corp: 13/268b lim: 35 exec/s: 42 rss: 232Mb L: 28/30 MS: 1 ShuffleBytes- 00:08:48.307 [2024-04-18 11:44:38.605518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:38c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.605555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.307 [2024-04-18 11:44:38.605626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.605644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.307 [2024-04-18 11:44:38.605706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.605723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.307 [2024-04-18 11:44:38.645591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:38c2c2c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.645624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.307 [2024-04-18 11:44:38.645681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.645698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.307 [2024-04-18 11:44:38.645752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.307 [2024-04-18 11:44:38.645768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
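In every completion line above, "(00/01)" is (status code type / status code): 0x0 = generic command status, 0x01 = invalid opcode. sqhd is the reported submission queue head, p the phase tag, m the "more" bit, and dnr "do not retry". A short self-contained sketch of that decode (the bit layout is from the NVMe base spec and mirrors SPDK's struct spdk_nvme_status bitfields; the helper itself is illustrative, not SPDK's printer):

```c
/*
 * Sketch: unpack the 16-bit completion status word that
 * spdk_nvme_print_completion renders as "(SCT/SC) ... p:. m:. dnr:.".
 */
#include <stdint.h>
#include <stdio.h>

static void print_cpl_status(uint16_t sqhd, uint16_t status)
{
    unsigned p   = status & 0x1;          /* phase tag                 */
    unsigned sc  = (status >> 1) & 0xff;  /* status code               */
    unsigned sct = (status >> 9) & 0x7;   /* status code type          */
    unsigned m   = (status >> 14) & 0x1;  /* more status info in log   */
    unsigned dnr = (status >> 15) & 0x1;  /* do not retry              */

    /* SCT 0x0 = generic, SC 0x01 = invalid opcode -> "(00/01)" */
    printf("(%02x/%02x) sqhd:%04x p:%u m:%u dnr:%u\n",
           sct, sc, sqhd, p, m, dnr);
}

int main(void)
{
    print_cpl_status(0x000f, 0x1 << 1);   /* INVALID OPCODE, as logged */
    return 0;
}
```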
00:08:48.307 #44 NEW cov: 11880 ft: 13903 corp: 14/291b lim: 35 exec/s: 44 rss: 234Mb L: 23/30 MS: 1 ChangeBinInt- 00:08:48.308 [2024-04-18 11:44:38.706155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.706190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.308 [2024-04-18 11:44:38.746256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.746288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.308 #46 NEW cov: 11880 ft: 14627 corp: 15/298b lim: 35 exec/s: 46 rss: 235Mb L: 7/30 MS: 1 EraseBytes- 00:08:48.308 [2024-04-18 11:44:38.794221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c221 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.794254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.308 [2024-04-18 11:44:38.794310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.794327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.308 [2024-04-18 11:44:38.844319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c2c2c221 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.844352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.308 [2024-04-18 11:44:38.844408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c3c2 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.308 [2024-04-18 11:44:38.844431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.568 #48 NEW cov: 11880 ft: 14823 corp: 16/313b lim: 35 exec/s: 48 rss: 236Mb L: 15/30 MS: 1 InsertByte- 00:08:48.568 [2024-04-18 11:44:38.905894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100c2c2 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.905929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.905989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.906007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.906061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.906077] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.906128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c3c21313 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.906144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.956068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0100c2c2 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.956100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.956154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c2c2c2c2 cdw11:c2c20000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.956171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.956225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:13131313 cdw11:13130000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.956244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:38.956298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c3c21313 cdw11:c2c20003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:38.956314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:48.568 #50 NEW cov: 11880 ft: 14858 corp: 17/342b lim: 35 exec/s: 50 rss: 238Mb L: 29/30 MS: 1 CrossOver- 00:08:48.568 [2024-04-18 11:44:39.006927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:39.006962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.568 [2024-04-18 11:44:39.047020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000101 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:48.568 [2024-04-18 11:44:39.047052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:48.568 #54 NEW cov: 11880 ft: 14879 corp: 18/352b lim: 35 exec/s: 27 rss: 239Mb L: 10/30 MS: 3 ShuffleBytes-PersAutoDict-CopyPart- DE: "\001\000\000\000"- 00:08:48.568 #54 DONE cov: 11880 ft: 14879 corp: 18/352b lim: 35 exec/s: 27 rss: 239Mb 00:08:48.568 ###### Recommended dictionary. ###### 00:08:48.568 "\001\000\000\000" # Uses: 2 00:08:48.568 ###### End of recommended dictionary. 
###### 00:08:48.568 Done 54 runs in 2 second(s) 00:08:49.138 11:44:39 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:49.138 11:44:39 -- ../common.sh@72 -- # (( i++ )) 00:08:49.138 11:44:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.138 11:44:39 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.138 11:44:39 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:49.138 11:44:39 -- nvmf/run.sh@24 -- # local timen=1 00:08:49.138 11:44:39 -- nvmf/run.sh@25 -- # local core=0x1 00:08:49.138 11:44:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:49.138 11:44:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:49.138 11:44:39 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:49.138 11:44:39 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:49.138 11:44:39 -- nvmf/run.sh@34 -- # printf %02d 5 00:08:49.138 11:44:39 -- nvmf/run.sh@34 -- # port=4405 00:08:49.138 11:44:39 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:49.138 11:44:39 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:49.138 11:44:39 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:49.138 11:44:39 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.138 11:44:39 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:49.138 11:44:39 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:49.138 [2024-04-18 11:44:39.566940] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:49.138 [2024-04-18 11:44:39.567051] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367571 ] 00:08:49.138 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.398 [2024-04-18 11:44:39.835440] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.658 [2024-04-18 11:44:39.988467] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.923 [2024-04-18 11:44:40.239564] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:49.923 [2024-04-18 11:44:40.255772] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:49.923 INFO: Running with entropic power schedule (0xFF, 100). 
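The shell trace above shows how each run is wired up: nvmf/run.sh writes a per-run config (/tmp/fuzz_json_5.conf), rewrites trsvcid 4420 to 4405 for this instance, registers LSAN leak suppressions, and launches llvm_nvme_fuzz against the TCP listener with -F <trid> and fuzzer number -Z 5. The harness follows the stock libFuzzer contract; a minimal sketch of that shape is below. Only the LLVMFuzzerTestOneInput signature is libFuzzer's — the struct and consume() helper are illustrative stand-ins, not the real TestOneInput at llvm_nvme_fuzz.c:780, which submits the command over the live NVMe-oF connection:

```c
/* Minimal libFuzzer harness in the shape of TestOneInput above. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct fuzz_cmd {
    uint8_t  opc;
    uint32_t cdw10;
    uint32_t cdw11;
};

/* Interpret the head of the fuzz input as one admin command. */
static int consume(const uint8_t *data, size_t size, struct fuzz_cmd *cmd)
{
    if (size < sizeof(*cmd)) {
        return -1;              /* too short: skip, libFuzzer mutates on */
    }
    memcpy(cmd, data, sizeof(*cmd));
    return 0;
}

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct fuzz_cmd cmd;

    if (consume(data, size, &cmd) == 0) {
        /* The real harness sends cmd to the target and prints the
         * command/completion pairs seen throughout this log. */
    }
    return 0;                   /* non-zero returns are reserved by libFuzzer */
}
```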
00:08:49.923 INFO: Seed: 300101337 00:08:49.923 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:49.923 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:49.923 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:49.923 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.923 #2 INITED exec/s: 0 rss: 200Mb 00:08:49.923 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:49.923 This may also happen if the target rejected all inputs we tried so far 00:08:49.923 [2024-04-18 11:44:40.334140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.923 [2024-04-18 11:44:40.334187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:49.923 [2024-04-18 11:44:40.334273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.923 [2024-04-18 11:44:40.334292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:49.923 [2024-04-18 11:44:40.334375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.923 [2024-04-18 11:44:40.334398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:49.923 [2024-04-18 11:44:40.334507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:49.923 [2024-04-18 11:44:40.334527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.196 NEW_FUNC[1/671]: 0x5524f0 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:50.196 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.196 [2024-04-18 11:44:40.705162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.196 [2024-04-18 11:44:40.705226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.196 [2024-04-18 11:44:40.705328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.196 [2024-04-18 11:44:40.705354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.196 [2024-04-18 11:44:40.705450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.196 [2024-04-18 11:44:40.705474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:08:50.196 [2024-04-18 11:44:40.705562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.196 [2024-04-18 11:44:40.705587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.196 #14 NEW cov: 11769 ft: 11742 corp: 2/45b lim: 45 exec/s: 0 rss: 216Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:08:50.466 [2024-04-18 11:44:40.775204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.775242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.775346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.775365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.775458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.775479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.775558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.775578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.775668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.775687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.835412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.835449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.835546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.835566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.835661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.835681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.835768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.466 [2024-04-18 11:44:40.835788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.466 [2024-04-18 11:44:40.835875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.835896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.467 #21 NEW cov: 11793 ft: 12354 corp: 3/90b lim: 45 exec/s: 0 rss: 219Mb L: 45/45 MS: 1 InsertByte- 00:08:50.467 [2024-04-18 11:44:40.908075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.908110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.908211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.908235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.908322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.908343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.908443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.908468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.958126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.958156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.958260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.958278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.958368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.467 [2024-04-18 11:44:40.958388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.467 [2024-04-18 11:44:40.958475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
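Run 5's NEW_FUNC lines show it exercises fuzz_admin_create_io_submission_queue_command, so the commands above are CREATE IO SQ (opcode 0x01) rather than CQ. The recurring cdw11:ffff0007 decodes cleanly against the base spec: CQID 0xffff, queue priority 0b11, physically-contiguous bit set. A companion sketch of that decode (field positions from the NVMe base spec; a local illustration, not SPDK code):

```c
/*
 * Decode CDW11 of a fuzzed "Create I/O Submission Queue" command
 * (opcode 0x01) as seen repeatedly in run 5.
 */
#include <stdint.h>
#include <stdio.h>

static void decode_create_sq_cdw11(uint32_t cdw11)
{
    unsigned pc    = cdw11 & 0x1;         /* physically contiguous       */
    unsigned qprio = (cdw11 >> 1) & 0x3;  /* queue priority (00 = urgent) */
    unsigned cqid  = cdw11 >> 16;         /* completion queue to bind to  */

    printf("cdw11:%08x -> pc:%u qprio:%u cqid:%04x\n", cdw11, pc, qprio, cqid);
}

int main(void)
{
    decode_create_sq_cdw11(0xffff0007);   /* value mutated into above */
    return 0;
}
```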
00:08:50.467 [2024-04-18 11:44:40.958495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.467 #23 NEW cov: 11805 ft: 12552 corp: 4/134b lim: 45 exec/s: 0 rss: 220Mb L: 44/45 MS: 1 ChangeBit- 00:08:50.733 [2024-04-18 11:44:41.029165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.029198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.029293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.029312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.029404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.029429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.029519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.029540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.029626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.029646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.089458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.089497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.089587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.089606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.089696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.089714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.089795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.089815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.089898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.089917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.733 #25 NEW cov: 11891 ft: 12897 corp: 5/179b lim: 45 exec/s: 0 rss: 222Mb L: 45/45 MS: 1 CopyPart- 00:08:50.733 [2024-04-18 11:44:41.157315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00612b00 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.157349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.217592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00612b00 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.217623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.733 #29 NEW cov: 11891 ft: 14033 corp: 6/188b lim: 45 exec/s: 0 rss: 223Mb L: 9/45 MS: 3 ChangeBit-ChangeBit-CMP- DE: "\000\000a\200\000\000Ul"- 00:08:50.733 [2024-04-18 11:44:41.283314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.283351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.283447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.283469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.283558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:21ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.283578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.283661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.283682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.733 [2024-04-18 11:44:41.283767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.733 [2024-04-18 11:44:41.283791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.333584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.333616] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.333714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3bffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.333734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.333826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:21ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.333845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.333923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.333943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.334030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.334050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:50.993 #31 NEW cov: 11891 ft: 14110 corp: 7/233b lim: 45 exec/s: 31 rss: 225Mb L: 45/45 MS: 1 ChangeByte- 00:08:50.993 [2024-04-18 11:44:41.398155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.398189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.398292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.398311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.448697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.448730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.993 [2024-04-18 11:44:41.448831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.993 [2024-04-18 11:44:41.448851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:50.993 #33 NEW cov: 11891 ft: 14411 corp: 8/254b lim: 45 exec/s: 33 rss: 226Mb L: 21/45 MS: 1 InsertRepeatedBytes- 00:08:50.994 [2024-04-18 11:44:41.515605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.994 [2024-04-18 11:44:41.515647] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:50.994 [2024-04-18 11:44:41.515749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.994 [2024-04-18 11:44:41.515773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.253 [2024-04-18 11:44:41.575478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.253 [2024-04-18 11:44:41.575517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.253 [2024-04-18 11:44:41.575617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.253 [2024-04-18 11:44:41.575638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.253 #35 NEW cov: 11891 ft: 14440 corp: 9/275b lim: 45 exec/s: 35 rss: 227Mb L: 21/45 MS: 1 ShuffleBytes- 00:08:51.253 [2024-04-18 11:44:41.647598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00610a00 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.253 [2024-04-18 11:44:41.647645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.253 [2024-04-18 11:44:41.697818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00610a00 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.253 [2024-04-18 11:44:41.697849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.253 #37 NEW cov: 11891 ft: 14451 corp: 10/284b lim: 45 exec/s: 37 rss: 229Mb L: 9/45 MS: 1 PersAutoDict- DE: "\000\000a\200\000\000Ul"- 00:08:51.253 [2024-04-18 11:44:41.774407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.253 [2024-04-18 11:44:41.774453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.253 [2024-04-18 11:44:41.774544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.254 [2024-04-18 11:44:41.774567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.254 [2024-04-18 11:44:41.774664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.254 [2024-04-18 11:44:41.774686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.824508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
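In the "#N NEW" status lines, the "MS:" trailer names the mutation sequence that produced the new input and "DE:" the dictionary entry it used: "CMP-" entries are values captured from compare instrumentation, and "PersAutoDict-" (as in #37 above, reusing "\000\000a\200\000\000Ul") replays them from libFuzzer's persistent auto-dictionary; the "Recommended dictionary" block printed at the end of run 4 reports the same entries with their use counts. A simplified, self-contained illustration of the splice mechanics only (libFuzzer's real mutator is more elaborate):

```c
/*
 * Simplified illustration of a dictionary-splice mutation: insert a
 * recorded token (here the 8-byte CMP capture "\000\000a\200\000\000Ul")
 * into the test input at a chosen offset.
 */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Insert token[0..token_len) at offset `pos`, shifting the tail right.
 * Returns the new length, or 0 if it would overflow max_len. */
static size_t splice_token(uint8_t *buf, size_t len, size_t max_len,
                           size_t pos, const uint8_t *token, size_t token_len)
{
    if (pos > len || len + token_len > max_len) {
        return 0;
    }
    memmove(buf + pos + token_len, buf + pos, len - pos);
    memcpy(buf + pos, token, token_len);
    return len + token_len;
}

int main(void)
{
    /* "\000\000a\200\000\000Ul" as raw bytes */
    static const uint8_t token[8] = { 0x00, 0x00, 0x61, 0x80,
                                      0x00, 0x00, 0x55, 0x6c };
    uint8_t input[64] = { 0xc2, 0xc2, 0xc2, 0xc2 };
    size_t n = splice_token(input, 4, sizeof(input), 2, token, sizeof(token));

    for (size_t i = 0; i < n; i++) {
        printf("%02x ", input[i]);
    }
    printf("\n");
    return 0;
}
```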
00:08:51.513 [2024-04-18 11:44:41.824541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.824639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.824659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.824743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.824763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.513 #39 NEW cov: 11891 ft: 14711 corp: 11/314b lim: 45 exec/s: 39 rss: 229Mb L: 30/45 MS: 1 InsertRepeatedBytes- 00:08:51.513 [2024-04-18 11:44:41.896318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.896365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.896465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.896489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.956751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.956785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:41.956888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:41.956911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.513 #46 NEW cov: 11891 ft: 14725 corp: 12/335b lim: 45 exec/s: 46 rss: 231Mb L: 21/45 MS: 1 CrossOver- 00:08:51.513 [2024-04-18 11:44:42.028616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:42.028660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:42.028769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:42.028794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:42.028891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:51.513 [2024-04-18 11:44:42.028926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.513 [2024-04-18 11:44:42.029019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.513 [2024-04-18 11:44:42.029044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.513 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:51.774 [2024-04-18 11:44:42.088842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.088878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.088968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.088989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.089074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.089096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.089182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.089201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.774 #48 NEW cov: 11908 ft: 14769 corp: 13/379b lim: 45 exec/s: 48 rss: 232Mb L: 44/45 MS: 1 CopyPart- 00:08:51.774 [2024-04-18 11:44:42.161542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.161585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.161679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.161704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.211800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.211832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.211913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.211931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.774 #50 NEW cov: 11908 ft: 14792 corp: 14/401b lim: 45 exec/s: 50 rss: 234Mb L: 22/45 MS: 1 CopyPart- 00:08:51.774 [2024-04-18 11:44:42.272831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:f7ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.272865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.272956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.272979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.273064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.273085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.273179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.273202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.323207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:f7ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.323239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.323342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.323362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.323460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.323482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:51.774 [2024-04-18 11:44:42.323583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.774 [2024-04-18 11:44:42.323603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:52.034 #52 NEW cov: 11908 ft: 14875 corp: 15/445b lim: 45 exec/s: 26 rss: 235Mb L: 44/45 MS: 1 ChangeBit- 00:08:52.034 #52 DONE cov: 11908 ft: 14875 corp: 15/445b lim: 45 exec/s: 26 rss: 235Mb 00:08:52.034 ###### Recommended dictionary. 
######
00:08:52.034 "\000\000a\200\000\000Ul" # Uses: 1
00:08:52.034 ###### End of recommended dictionary. ######
00:08:52.034 Done 52 runs in 2 second(s)
00:08:52.293 11:44:42 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz
00:08:52.293 11:44:42 -- ../common.sh@72 -- # (( i++ ))
00:08:52.293 11:44:42 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:52.293 11:44:42 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:52.293 11:44:42 -- nvmf/run.sh@23 -- # local fuzzer_type=6
00:08:52.293 11:44:42 -- nvmf/run.sh@24 -- # local timen=1
00:08:52.293 11:44:42 -- nvmf/run.sh@25 -- # local core=0x1
00:08:52.293 11:44:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:08:52.293 11:44:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf
00:08:52.293 11:44:42 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:52.293 11:44:42 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:52.293 11:44:42 -- nvmf/run.sh@34 -- # printf %02d 6
00:08:52.293 11:44:42 -- nvmf/run.sh@34 -- # port=4406
00:08:52.293 11:44:42 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6
00:08:52.293 11:44:42 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406'
00:08:52.293 11:44:42 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:52.293 11:44:42 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:52.293 11:44:42 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:52.293 11:44:42 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6
00:08:52.552 [2024-04-18 11:44:42.845138] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:52.552 [2024-04-18 11:44:42.845230] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid367955 ]
00:08:52.552 EAL: No free 2048 kB hugepages reported on node 1
00:08:52.552 [2024-04-18 11:44:43.102026] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:52.811 [2024-04-18 11:44:43.256136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:53.071 [2024-04-18 11:44:43.507191] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:53.071 [2024-04-18 11:44:43.523393] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 ***
00:08:53.071 INFO: Running with entropic power schedule (0xFF, 100).
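The shell trace above is nvmf/run.sh provisioning one fuzzer instance: it derives a per-instance TCP port from the fuzzer number, rewrites the JSON target config so this instance gets its own listener, registers two LeakSanitizer suppressions, and launches llvm_nvme_fuzz pinned to a single core. A condensed sketch of that sequence, assembled from the trace; the WS shorthand, the sed output redirect, and the "44" port prefix are illustrative assumptions, not verbatim run.sh:

  WS=/var/jenkins/workspace/short-fuzz-phy-autotest   # workspace root seen in the log
  fuzzer_type=6                                       # -Z selector; run 6 exercises fuzz_admin_delete_io_completion_queue_command
  port="44$(printf %02d "$fuzzer_type")"              # assumption: 4406/4407 in the log suggest a fixed 44 prefix
  corpus_dir="$WS/spdk/../corpus/llvm_nvmf_${fuzzer_type}"
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  mkdir -p "$corpus_dir"
  # Patch trsvcid so parallel instances don't collide on the default port 4420.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$WS/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # Suppress the two allocations named in the trace so LSan does not fail the run on them.
  printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
  export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
  # One core (-m 0x1), 512 MB of memory (-s 512), time limit -t 1, persistent corpus -D.
  "$WS/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$WS/spdk/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}" \
      -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type"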
00:08:53.071 INFO: Seed: 3568185471 00:08:53.071 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:53.071 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:53.071 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:53.071 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.071 #2 INITED exec/s: 0 rss: 199Mb 00:08:53.071 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.071 This may also happen if the target rejected all inputs we tried so far 00:08:53.071 [2024-04-18 11:44:43.579006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:53.071 [2024-04-18 11:44:43.579057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.591 NEW_FUNC[1/669]: 0x555180 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:53.591 NEW_FUNC[2/669]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.591 [2024-04-18 11:44:43.949910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:53.591 [2024-04-18 11:44:43.949976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.591 #8 NEW cov: 11676 ft: 11653 corp: 2/3b lim: 10 exec/s: 0 rss: 217Mb L: 2/2 MS: 5 ChangeByte-ChangeBit-CopyPart-ChangeBinInt-CopyPart- 00:08:53.591 [2024-04-18 11:44:44.002967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:53.591 [2024-04-18 11:44:44.003027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.591 [2024-04-18 11:44:44.052997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:53.591 [2024-04-18 11:44:44.053038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.591 #11 NEW cov: 11710 ft: 12196 corp: 3/5b lim: 10 exec/s: 0 rss: 218Mb L: 2/2 MS: 2 CopyPart-InsertByte- 00:08:53.591 [2024-04-18 11:44:44.099340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:53.591 [2024-04-18 11:44:44.099382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.591 [2024-04-18 11:44:44.099434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ccd8 cdw11:00000000 00:08:53.591 [2024-04-18 11:44:44.099455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.169515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.169557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.169621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ccd8 cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.169642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.852 #13 NEW cov: 11722 ft: 12912 corp: 4/10b lim: 10 exec/s: 0 rss: 220Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:53.852 [2024-04-18 11:44:44.214790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.214826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.264877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.264909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.852 #15 NEW cov: 11808 ft: 13179 corp: 5/12b lim: 10 exec/s: 0 rss: 222Mb L: 2/5 MS: 1 CopyPart- 00:08:53.852 [2024-04-18 11:44:44.326126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.326160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.326219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.326239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.376191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.376221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:53.852 [2024-04-18 11:44:44.376297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:53.852 [2024-04-18 11:44:44.376314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:53.852 #17 NEW cov: 11808 ft: 13243 corp: 6/17b lim: 10 exec/s: 0 rss: 223Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:08:54.112 [2024-04-18 11:44:44.422985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.423020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.423077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.423094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.473089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000300a cdw11:00000000 00:08:54.112 [2024-04-18 
11:44:44.473121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.473182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.473199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.112 #19 NEW cov: 11808 ft: 13303 corp: 7/22b lim: 10 exec/s: 0 rss: 225Mb L: 5/5 MS: 1 ChangeByte- 00:08:54.112 [2024-04-18 11:44:44.517633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.517668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.517722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.517738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.567784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.567817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.567870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.567888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.112 #21 NEW cov: 11808 ft: 13414 corp: 8/26b lim: 10 exec/s: 21 rss: 227Mb L: 4/5 MS: 1 EraseBytes- 00:08:54.112 [2024-04-18 11:44:44.615194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.615229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.615281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.615298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.655221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.655254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.112 [2024-04-18 11:44:44.655310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.112 [2024-04-18 11:44:44.655327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.373 #23 NEW cov: 11808 ft: 13429 corp: 9/30b lim: 10 exec/s: 23 rss: 228Mb L: 4/5 MS: 1 CopyPart- 00:08:54.373 [2024-04-18 11:44:44.700630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.700664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.700720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.700737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.700789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.700808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.750868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.750900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.750970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.750987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.751039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.751056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.373 #25 NEW cov: 11808 ft: 13681 corp: 10/36b lim: 10 exec/s: 25 rss: 229Mb L: 6/6 MS: 1 CrossOver- 00:08:54.373 [2024-04-18 11:44:44.798600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.798635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.798695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006c76 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.798713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.798764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006f6c cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.798781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.798832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ccd8 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.798848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.838676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cccc cdw11:00000000 00:08:54.373 [2024-04-18 
11:44:44.838712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.838772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006c76 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.838789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.838840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006f6c cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.838857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:54.373 [2024-04-18 11:44:44.838909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ccd8 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.838925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:54.373 #27 NEW cov: 11808 ft: 13917 corp: 11/45b lim: 10 exec/s: 27 rss: 231Mb L: 9/9 MS: 1 CMP- DE: "lvol"- 00:08:54.373 [2024-04-18 11:44:44.885717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001041 cdw11:00000000 00:08:54.373 [2024-04-18 11:44:44.885757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.633 [2024-04-18 11:44:44.925878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001041 cdw11:00000000 00:08:54.633 [2024-04-18 11:44:44.925911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.633 #32 NEW cov: 11808 ft: 14078 corp: 12/47b lim: 10 exec/s: 32 rss: 232Mb L: 2/9 MS: 4 ChangeBit-CopyPart-ChangeBinInt-InsertByte- 00:08:54.633 [2024-04-18 11:44:44.971865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:08:54.633 [2024-04-18 11:44:44.971899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.633 [2024-04-18 11:44:45.011939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:08:54.633 [2024-04-18 11:44:45.011972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.633 #34 NEW cov: 11808 ft: 14155 corp: 13/49b lim: 10 exec/s: 34 rss: 232Mb L: 2/9 MS: 1 ChangeBit- 00:08:54.633 [2024-04-18 11:44:45.059175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:54.634 [2024-04-18 11:44:45.059208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.634 [2024-04-18 11:44:45.099286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:54.634 [2024-04-18 11:44:45.099318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.634 #36 NEW cov: 11808 ft: 14188 corp: 
14/51b lim: 10 exec/s: 36 rss: 233Mb L: 2/9 MS: 1 CopyPart- 00:08:54.634 [2024-04-18 11:44:45.157157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006c76 cdw11:00000000 00:08:54.634 [2024-04-18 11:44:45.157190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.634 [2024-04-18 11:44:45.157252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006f6c cdw11:00000000 00:08:54.634 [2024-04-18 11:44:45.157269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.893 [2024-04-18 11:44:45.197264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00006c76 cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.197300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.893 [2024-04-18 11:44:45.197354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006f6c cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.197371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.893 #39 NEW cov: 11808 ft: 14226 corp: 15/56b lim: 10 exec/s: 39 rss: 235Mb L: 5/9 MS: 2 ShuffleBytes-PersAutoDict- DE: "lvol"- 00:08:54.893 [2024-04-18 11:44:45.243547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.243581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.893 [2024-04-18 11:44:45.243635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.243652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.893 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:54.893 [2024-04-18 11:44:45.283651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.283682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.893 [2024-04-18 11:44:45.283736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.283753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:54.893 #41 NEW cov: 11825 ft: 14252 corp: 16/60b lim: 10 exec/s: 41 rss: 236Mb L: 4/9 MS: 1 EraseBytes- 00:08:54.893 [2024-04-18 11:44:45.335633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.335667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.893 [2024-04-18 11:44:45.385765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) 
qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:08:54.893 [2024-04-18 11:44:45.385797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:54.893 #43 NEW cov: 11825 ft: 14275 corp: 17/62b lim: 10 exec/s: 43 rss: 237Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:55.154 [2024-04-18 11:44:45.447057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.447090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.447150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.447167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.487119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.487151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.487205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.487222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.154 #45 NEW cov: 11825 ft: 14353 corp: 18/67b lim: 10 exec/s: 45 rss: 238Mb L: 5/9 MS: 1 CopyPart- 00:08:55.154 [2024-04-18 11:44:45.534489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.534523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.534576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.534592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.574578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.574609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:55.154 [2024-04-18 11:44:45.574664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002e0a cdw11:00000000 00:08:55.154 [2024-04-18 11:44:45.574681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:55.154 #47 NEW cov: 11825 ft: 14384 corp: 19/71b lim: 10 exec/s: 23 rss: 240Mb L: 4/9 MS: 1 CopyPart- 00:08:55.154 #47 DONE cov: 11825 ft: 14384 corp: 19/71b lim: 10 exec/s: 23 rss: 240Mb 00:08:55.154 ###### Recommended dictionary. ###### 00:08:55.154 "lvol" # Uses: 1 00:08:55.154 ###### End of recommended dictionary. 
######
00:08:55.154 Done 47 runs in 2 second(s)
00:08:55.724 11:44:46 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz
00:08:55.724 11:44:46 -- ../common.sh@72 -- # (( i++ ))
00:08:55.724 11:44:46 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:55.724 11:44:46 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1
00:08:55.724 11:44:46 -- nvmf/run.sh@23 -- # local fuzzer_type=7
00:08:55.724 11:44:46 -- nvmf/run.sh@24 -- # local timen=1
00:08:55.724 11:44:46 -- nvmf/run.sh@25 -- # local core=0x1
00:08:55.724 11:44:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:08:55.724 11:44:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf
00:08:55.724 11:44:46 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:55.724 11:44:46 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:55.724 11:44:46 -- nvmf/run.sh@34 -- # printf %02d 7
00:08:55.724 11:44:46 -- nvmf/run.sh@34 -- # port=4407
00:08:55.724 11:44:46 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:08:55.724 11:44:46 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407'
00:08:55.724 11:44:46 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:55.724 11:44:46 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:55.724 11:44:46 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:55.724 11:44:46 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7
00:08:55.984 [2024-04-18 11:44:46.087577] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
00:08:55.984 [2024-04-18 11:44:46.087685] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid368449 ]
00:08:55.984 EAL: No free 2048 kB hugepages reported on node 1
00:08:55.984 [2024-04-18 11:44:46.354797] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:55.984 [2024-04-18 11:44:46.506928] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:56.243 [2024-04-18 11:44:46.751736] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:56.243 [2024-04-18 11:44:46.767959] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 ***
00:08:56.243 INFO: Running with entropic power schedule (0xFF, 100).
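The "#N NEW cov: ..." events in these runs are standard libFuzzer status lines: N is the total number of executions when the event fired, cov is the number of coverage points hit so far, ft counts distinct coverage features, corp gives the corpus as units/total bytes, lim is the current input-length cap, exec/s and rss are throughput and memory, L: a/b is (roughly) the new input's length against the corpus maximum, MS lists the mutation sequence that produced it, and NEW_FUNC marks a function covered for the first time. To watch edge coverage grow across a run, a one-liner along these lines works on a saved capture of this output (console.log is a hypothetical capture file, not something the harness writes):

  # Print "event-id coverage" pairs, e.g. "#47 11825", one per NEW/DONE status event.
  grep -oE '#[0-9]+ (NEW|DONE) cov: [0-9]+' console.log | awk '{print $1, $4}'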
00:08:56.243 INFO: Seed: 2517121351 00:08:56.502 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:56.502 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:56.502 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:56.502 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.502 #2 INITED exec/s: 0 rss: 199Mb 00:08:56.502 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:56.502 This may also happen if the target rejected all inputs we tried so far 00:08:56.502 [2024-04-18 11:44:46.823531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b631 cdw11:00000000 00:08:56.502 [2024-04-18 11:44:46.823575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.762 NEW_FUNC[1/669]: 0x555c70 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:56.762 NEW_FUNC[2/669]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:56.762 [2024-04-18 11:44:47.184533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b631 cdw11:00000000 00:08:56.762 [2024-04-18 11:44:47.184598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.762 #7 NEW cov: 11686 ft: 11659 corp: 2/3b lim: 10 exec/s: 0 rss: 217Mb L: 2/2 MS: 4 ChangeByte-CopyPart-ShuffleBytes-InsertByte- 00:08:56.762 [2024-04-18 11:44:47.250128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e8e cdw11:00000000 00:08:56.762 [2024-04-18 11:44:47.250187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:56.762 [2024-04-18 11:44:47.310351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e8e cdw11:00000000 00:08:56.762 [2024-04-18 11:44:47.310394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.022 #13 NEW cov: 11710 ft: 12191 corp: 3/5b lim: 10 exec/s: 0 rss: 218Mb L: 2/2 MS: 5 ChangeBit-ShuffleBytes-ChangeBit-ShuffleBytes-CopyPart- 00:08:57.022 [2024-04-18 11:44:47.355655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:57.022 [2024-04-18 11:44:47.355697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.022 [2024-04-18 11:44:47.405784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:57.022 [2024-04-18 11:44:47.405824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.022 #15 NEW cov: 11722 ft: 12574 corp: 4/7b lim: 10 exec/s: 0 rss: 220Mb L: 2/2 MS: 1 CrossOver- 00:08:57.022 [2024-04-18 11:44:47.451824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000025b6 cdw11:00000000 00:08:57.022 
[2024-04-18 11:44:47.451866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.022 [2024-04-18 11:44:47.521977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000025b6 cdw11:00000000 00:08:57.022 [2024-04-18 11:44:47.522024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.022 #17 NEW cov: 11808 ft: 12767 corp: 5/10b lim: 10 exec/s: 0 rss: 221Mb L: 3/3 MS: 1 InsertByte- 00:08:57.022 [2024-04-18 11:44:47.567982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e0a cdw11:00000000 00:08:57.022 [2024-04-18 11:44:47.568034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 [2024-04-18 11:44:47.617977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e0a cdw11:00000000 00:08:57.282 [2024-04-18 11:44:47.618011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 #20 NEW cov: 11808 ft: 12880 corp: 6/12b lim: 10 exec/s: 0 rss: 223Mb L: 2/3 MS: 2 EraseBytes-CrossOver- 00:08:57.282 [2024-04-18 11:44:47.664102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b631 cdw11:00000000 00:08:57.282 [2024-04-18 11:44:47.664137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 [2024-04-18 11:44:47.704210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b631 cdw11:00000000 00:08:57.282 [2024-04-18 11:44:47.704241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 #22 NEW cov: 11808 ft: 13055 corp: 7/14b lim: 10 exec/s: 0 rss: 224Mb L: 2/3 MS: 1 CopyPart- 00:08:57.282 [2024-04-18 11:44:47.750238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:57.282 [2024-04-18 11:44:47.750270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 [2024-04-18 11:44:47.790351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:57.282 [2024-04-18 11:44:47.790382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.282 #24 NEW cov: 11808 ft: 13140 corp: 8/17b lim: 10 exec/s: 24 rss: 225Mb L: 3/3 MS: 1 InsertByte- 00:08:57.542 [2024-04-18 11:44:47.836063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.836097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:47.836152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004431 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.836169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:47.886183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.886213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:47.886266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004431 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.886283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.542 #26 NEW cov: 11808 ft: 13392 corp: 9/21b lim: 10 exec/s: 26 rss: 227Mb L: 4/4 MS: 1 InsertByte- 00:08:57.542 [2024-04-18 11:44:47.931151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005db6 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.931185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:47.981380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005db6 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:47.981421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 #28 NEW cov: 11808 ft: 13435 corp: 10/24b lim: 10 exec/s: 28 rss: 228Mb L: 3/4 MS: 1 InsertByte- 00:08:57.542 [2024-04-18 11:44:48.042194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.042230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.042286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.042304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.042360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.042376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.042433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.042449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.042502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000d92e cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.042518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.092323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.092358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.092426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.092444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.092496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.092513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.092563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.092580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:57.542 [2024-04-18 11:44:48.092632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000d92e cdw11:00000000 00:08:57.542 [2024-04-18 11:44:48.092649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:57.802 #33 NEW cov: 11808 ft: 13802 corp: 11/34b lim: 10 exec/s: 33 rss: 230Mb L: 10/10 MS: 4 EraseBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:57.802 [2024-04-18 11:44:48.152669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004ad4 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.152708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.192724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004ad4 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.192759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.802 #35 NEW cov: 11808 ft: 13863 corp: 12/36b lim: 10 exec/s: 35 rss: 230Mb L: 2/10 MS: 1 ChangeBinInt- 00:08:57.802 [2024-04-18 11:44:48.248244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005dd9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.248284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.248338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.248371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.298423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005dd9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.298470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.298523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.298539] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.802 #37 NEW cov: 11808 ft: 13868 corp: 13/40b lim: 10 exec/s: 37 rss: 233Mb L: 4/10 MS: 1 CrossOver- 00:08:57.802 [2024-04-18 11:44:48.344713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.344748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.344803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.344819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.344877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.344893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:57.802 [2024-04-18 11:44:48.344943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:57.802 [2024-04-18 11:44:48.344958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.394843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.394876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.394930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.394947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.395001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.395017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.395066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000d9d9 cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.395082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.061 #39 NEW cov: 11808 ft: 13899 corp: 14/48b lim: 10 exec/s: 39 rss: 234Mb L: 8/10 MS: 1 EraseBytes- 00:08:58.061 [2024-04-18 11:44:48.442105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e8e cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.442137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.442194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.442211] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.442260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.442277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.442334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.442350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.482219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e8e cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.482250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.061 [2024-04-18 11:44:48.482305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.061 [2024-04-18 11:44:48.482321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.062 [2024-04-18 11:44:48.482372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.482389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:58.062 [2024-04-18 11:44:48.482462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.482484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:58.062 #41 NEW cov: 11808 ft: 13968 corp: 15/56b lim: 10 exec/s: 41 rss: 235Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:08:58.062 [2024-04-18 11:44:48.528811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.528845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.062 [2024-04-18 11:44:48.528899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000031d0 cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.528915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.062 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:58.062 [2024-04-18 11:44:48.568870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007ab6 cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.568902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.062 [2024-04-18 11:44:48.568956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000031d0 cdw11:00000000 00:08:58.062 [2024-04-18 11:44:48.568973] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:58.062 #43 NEW cov: 11825 ft: 14004 corp: 16/60b lim: 10 exec/s: 43 rss: 236Mb L: 4/10 MS: 1 InsertByte- 00:08:58.321 [2024-04-18 11:44:48.627649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005db6 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.627684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.321 [2024-04-18 11:44:48.667753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005db6 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.667787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.321 #45 NEW cov: 11825 ft: 14020 corp: 17/63b lim: 10 exec/s: 45 rss: 237Mb L: 3/10 MS: 1 ChangeBit- 00:08:58.321 [2024-04-18 11:44:48.728058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e92 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.728093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.321 [2024-04-18 11:44:48.768136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008e92 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.768168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.321 #47 NEW cov: 11825 ft: 14036 corp: 18/65b lim: 10 exec/s: 47 rss: 238Mb L: 2/10 MS: 1 ChangeByte- 00:08:58.321 [2024-04-18 11:44:48.815052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b6c9 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.815085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.321 [2024-04-18 11:44:48.855131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000b6c9 cdw11:00000000 00:08:58.321 [2024-04-18 11:44:48.855162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:58.581 #49 NEW cov: 11825 ft: 14039 corp: 19/67b lim: 10 exec/s: 24 rss: 239Mb L: 2/10 MS: 1 ChangeByte- 00:08:58.581 #49 DONE cov: 11825 ft: 14039 corp: 19/67b lim: 10 exec/s: 24 rss: 239Mb 00:08:58.581 Done 49 runs in 2 second(s) 00:08:58.841 11:44:49 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:58.841 11:44:49 -- ../common.sh@72 -- # (( i++ )) 00:08:58.841 11:44:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.841 11:44:49 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:58.841 11:44:49 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:58.841 11:44:49 -- nvmf/run.sh@24 -- # local timen=1 00:08:58.841 11:44:49 -- nvmf/run.sh@25 -- # local core=0x1 00:08:58.841 11:44:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:58.841 11:44:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:58.841 11:44:49 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:58.841 11:44:49 -- nvmf/run.sh@32 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:58.841 11:44:49 -- nvmf/run.sh@34 -- # printf %02d 8 00:08:58.841 11:44:49 -- nvmf/run.sh@34 -- # port=4408 00:08:58.841 11:44:49 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:58.841 11:44:49 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:58.841 11:44:49 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:58.841 11:44:49 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.841 11:44:49 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:58.841 11:44:49 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:58.841 [2024-04-18 11:44:49.366514] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:08:58.841 [2024-04-18 11:44:49.366624] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid368882 ] 00:08:59.101 EAL: No free 2048 kB hugepages reported on node 1 00:08:59.101 [2024-04-18 11:44:49.616698] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.361 [2024-04-18 11:44:49.769129] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:59.621 [2024-04-18 11:44:50.016427] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:59.621 [2024-04-18 11:44:50.032645] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:59.621 INFO: Running with entropic power schedule (0xFF, 100). 
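
[Note] A minimal sketch of the launch sequence traced above, for reproducing fuzzer 8 by hand. Everything here is transcribed from the nvmf/run.sh xtrace lines in this log, not from run.sh itself; SPDK_ROOT is a placeholder for the /var/jenkins/workspace/short-fuzz-phy-autotest/spdk checkout, and the output redirections (the config file and the LSAN suppress file targets) are assumptions, since bash xtrace does not print redirections:

    fuzzer=8
    port="44$(printf %02d "$fuzzer")"   # the trace shows printf %02d 8, then port=4408
    corpus="$SPDK_ROOT/../corpus/llvm_nvmf_$fuzzer"
    mkdir -p "$corpus"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # retarget the JSON config from the default port 4420 to this fuzzer's port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$fuzzer.conf"
    # suppress the two known, intentional leaks for LeakSanitizer
    printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS="report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0"
    "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$SPDK_ROOT/../output/llvm/" -F "$trid" \
        -c "/tmp/fuzz_json_$fuzzer.conf" -t 1 -D "$corpus" -Z "$fuzzer"

The same sequence repeats per fuzzer index (fuzzer 9 with port 4409 follows later in this log), so only the index changes between runs.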
00:08:59.621 INFO: Seed: 1488105098 00:08:59.621 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:08:59.621 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:08:59.621 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:59.621 INFO: A corpus is not provided, starting from an empty corpus 00:08:59.621 [2024-04-18 11:44:50.110548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.621 [2024-04-18 11:44:50.110594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.621 #2 INITED cov: 11714 ft: 11687 corp: 1/1b exec/s: 0 rss: 209Mb 00:08:59.621 [2024-04-18 11:44:50.160462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.621 [2024-04-18 11:44:50.160501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.881 [2024-04-18 11:44:50.220692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.881 [2024-04-18 11:44:50.220729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.881 #4 NEW cov: 11843 ft: 12286 corp: 2/2b lim: 5 exec/s: 0 rss: 217Mb L: 1/1 MS: 1 ChangeBit- 00:08:59.881 [2024-04-18 11:44:50.301973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.881 [2024-04-18 11:44:50.302018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.881 [2024-04-18 11:44:50.302128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.881 [2024-04-18 11:44:50.302155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.881 [2024-04-18 11:44:50.362080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.881 [2024-04-18 11:44:50.362111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:59.881 [2024-04-18 11:44:50.362197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:59.881 [2024-04-18 11:44:50.362215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:59.881 #6 NEW cov: 11928 ft: 13325 corp: 3/4b lim: 5 exec/s: 0 rss: 218Mb L: 2/2 MS: 1 InsertByte- 00:09:00.141 [2024-04-18 11:44:50.436099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:00.141 [2024-04-18 11:44:50.436132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.436237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.436260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.436357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.436381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.436478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.436498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.436602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.436620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.486177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.486207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.486301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.486318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.486412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.486436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.486518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.486538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.486623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.486640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 
p:0 m:0 dnr:0 00:09:00.141 #8 NEW cov: 11928 ft: 13815 corp: 4/9b lim: 5 exec/s: 0 rss: 220Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:00.141 [2024-04-18 11:44:50.560750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.560783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.560885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.560903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.560991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.561011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.561106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.561124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.561223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.561242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.621100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.621130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.621223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.621242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.141 [2024-04-18 11:44:50.621330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.141 [2024-04-18 11:44:50.621350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.142 [2024-04-18 11:44:50.621443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.142 [2024-04-18 11:44:50.621463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.142 [2024-04-18 
11:44:50.621557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.142 [2024-04-18 11:44:50.621576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.142 #10 NEW cov: 11928 ft: 14052 corp: 5/14b lim: 5 exec/s: 0 rss: 222Mb L: 5/5 MS: 1 CopyPart- 00:09:00.142 [2024-04-18 11:44:50.690349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.142 [2024-04-18 11:44:50.690386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.750576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.750609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.402 #12 NEW cov: 11928 ft: 14204 corp: 6/15b lim: 5 exec/s: 0 rss: 223Mb L: 1/5 MS: 1 EraseBytes- 00:09:00.402 [2024-04-18 11:44:50.825546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.825580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.825680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.825700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.825795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.825815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.825901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.825921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.826013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.826032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.875746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.875778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.875875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.875895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.875991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.876012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.876103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.876124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.876214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.876235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.402 #14 NEW cov: 11928 ft: 14291 corp: 7/20b lim: 5 exec/s: 0 rss: 225Mb L: 5/5 MS: 1 CrossOver- 00:09:00.402 [2024-04-18 11:44:50.940326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.940362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.940464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.940485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.940578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.940599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.940687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.940709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.402 [2024-04-18 11:44:50.940796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.402 [2024-04-18 11:44:50.940816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
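
[Note] A short key to the two kinds of lines in these runs, stated as general libFuzzer and NVMe conventions rather than anything defined by this log. Each *NOTICE* pair is SPDK decoding one fuzzed admin command and its reply: nvme_qpair.c:225 prints the command with the opcode in parentheses (NAMESPACE ATTACHMENT is admin opcode 15h, NAMESPACE MANAGEMENT 0dh, DELETE IO SQ 00h), and nvme_qpair.c:477 prints the completion, where INVALID OPCODE (00/01) is status code type 0h / status code 01h, meaning the target rejected the malformed opcode as it should. In the libFuzzer status lines, cov counts covered code edges, ft coverage features, corp the corpus entry count and total size (so corp: 7/20b means 7 inputs totalling 20 bytes), lim the current input-length cap, exec/s executions per second, rss resident memory, L (roughly) the new input's length against the largest in the corpus, and MS the mutation sequence that produced it, e.g. 1 CrossOver-.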
00:09:00.663 [2024-04-18 11:44:50.990526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:50.990558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:50.990661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:50.990681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:50.990771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:50.990791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:50.990880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:50.990899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:50.990992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:50.991011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.663 #16 NEW cov: 11928 ft: 14315 corp: 8/25b lim: 5 exec/s: 0 rss: 226Mb L: 5/5 MS: 1 ShuffleBytes- 00:09:00.663 [2024-04-18 11:44:51.066861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.066894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.066995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.067015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.067106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.067127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.067219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.067239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.067329] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.067349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.117000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.117030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.117133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.117154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.117244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.117264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.117353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.117373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.117472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.117491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:00.663 #18 NEW cov: 11928 ft: 14376 corp: 9/30b lim: 5 exec/s: 18 rss: 227Mb L: 5/5 MS: 1 CopyPart- 00:09:00.663 [2024-04-18 11:44:51.188430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.188467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.188577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.188600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.663 [2024-04-18 11:44:51.188701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.663 [2024-04-18 11:44:51.188725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.238751] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.238782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.238867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.238887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.238985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.239005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.923 #20 NEW cov: 11928 ft: 14545 corp: 10/33b lim: 5 exec/s: 20 rss: 229Mb L: 3/5 MS: 1 CrossOver- 00:09:00.923 [2024-04-18 11:44:51.297870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.297904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.297996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.298016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.298101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.298121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.348038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.348067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.348148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.348166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.348244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.348261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:00.923 #22 NEW cov: 11928 ft: 14599 corp: 11/36b lim: 5 exec/s: 22 rss: 230Mb L: 3/5 MS: 1 
EraseBytes- 00:09:00.923 [2024-04-18 11:44:51.418283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.418317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:00.923 [2024-04-18 11:44:51.468438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:00.923 [2024-04-18 11:44:51.468473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.183 #24 NEW cov: 11928 ft: 14626 corp: 12/37b lim: 5 exec/s: 24 rss: 230Mb L: 1/5 MS: 1 ChangeByte- 00:09:01.183 [2024-04-18 11:44:51.541231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.541269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.541374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.541394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.591385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.591429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.591521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.591540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.183 #26 NEW cov: 11928 ft: 14660 corp: 13/39b lim: 5 exec/s: 26 rss: 231Mb L: 2/5 MS: 1 InsertByte- 00:09:01.183 [2024-04-18 11:44:51.662101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.662137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.662235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.662257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.722363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.722393] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.183 [2024-04-18 11:44:51.722494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.183 [2024-04-18 11:44:51.722514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.443 #28 NEW cov: 11928 ft: 14697 corp: 14/41b lim: 5 exec/s: 28 rss: 233Mb L: 2/5 MS: 1 EraseBytes- 00:09:01.443 [2024-04-18 11:44:51.793194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.443 [2024-04-18 11:44:51.793235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.443 [2024-04-18 11:44:51.793338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.443 [2024-04-18 11:44:51.793362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.703 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:01.703 [2024-04-18 11:44:52.166665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.703 [2024-04-18 11:44:52.166715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:01.703 [2024-04-18 11:44:52.166822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:01.703 [2024-04-18 11:44:52.166841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:01.703 #30 NEW cov: 11945 ft: 14749 corp: 15/43b lim: 5 exec/s: 15 rss: 235Mb L: 2/5 MS: 1 ChangeBinInt- 00:09:01.703 #30 DONE cov: 11945 ft: 14749 corp: 15/43b lim: 5 exec/s: 15 rss: 235Mb 00:09:01.703 Done 30 runs in 2 second(s) 00:09:02.273 11:44:52 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:09:02.273 11:44:52 -- ../common.sh@72 -- # (( i++ )) 00:09:02.273 11:44:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:02.273 11:44:52 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:09:02.274 11:44:52 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:09:02.274 11:44:52 -- nvmf/run.sh@24 -- # local timen=1 00:09:02.274 11:44:52 -- nvmf/run.sh@25 -- # local core=0x1 00:09:02.274 11:44:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:02.274 11:44:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:09:02.274 11:44:52 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:02.274 11:44:52 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:02.274 11:44:52 -- nvmf/run.sh@34 -- # printf %02d 9 00:09:02.274 11:44:52 -- nvmf/run.sh@34 -- # port=4409 00:09:02.274 11:44:52 -- 
nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:02.274 11:44:52 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:09:02.274 11:44:52 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:02.274 11:44:52 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:02.274 11:44:52 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:02.274 11:44:52 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:09:02.274 [2024-04-18 11:44:52.699202] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:02.274 [2024-04-18 11:44:52.699312] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid369305 ] 00:09:02.274 EAL: No free 2048 kB hugepages reported on node 1 00:09:02.534 [2024-04-18 11:44:52.971879] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:02.794 [2024-04-18 11:44:53.125061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.054 [2024-04-18 11:44:53.371526] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:03.054 [2024-04-18 11:44:53.387741] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:09:03.054 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:03.054 INFO: Seed: 545257086 00:09:03.054 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:03.054 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:03.054 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:09:03.054 INFO: A corpus is not provided, starting from an empty corpus 00:09:03.054 [2024-04-18 11:44:53.447350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.054 [2024-04-18 11:44:53.447388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.054 #2 INITED cov: 11714 ft: 11687 corp: 1/1b exec/s: 0 rss: 210Mb 00:09:03.054 [2024-04-18 11:44:53.487355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.054 [2024-04-18 11:44:53.487388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.054 [2024-04-18 11:44:53.537514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.054 [2024-04-18 11:44:53.537551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.054 #4 NEW cov: 11843 ft: 12356 corp: 2/2b lim: 5 exec/s: 0 rss: 217Mb L: 1/1 MS: 1 ChangeByte- 00:09:03.054 [2024-04-18 11:44:53.601888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.054 [2024-04-18 11:44:53.601926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.642003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.642035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 #6 NEW cov: 11928 ft: 12884 corp: 3/3b lim: 5 exec/s: 0 rss: 219Mb L: 1/1 MS: 1 CopyPart- 00:09:03.314 [2024-04-18 11:44:53.690646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.690679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.690739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.690756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.690815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:03.314 [2024-04-18 11:44:53.690832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.690890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.690906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.690960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.690976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.740708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.740739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.740796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.740813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.740873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.740888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.740957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.740973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.741033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.741049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.314 #8 NEW cov: 11928 ft: 13999 corp: 4/8b lim: 5 exec/s: 0 rss: 221Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:09:03.314 [2024-04-18 11:44:53.786560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 11:44:53.786605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 [2024-04-18 11:44:53.826593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.314 [2024-04-18 
11:44:53.826625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.314 #10 NEW cov: 11928 ft: 14107 corp: 5/9b lim: 5 exec/s: 0 rss: 222Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:03.572 [2024-04-18 11:44:53.884767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.884801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.572 [2024-04-18 11:44:53.884861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.884877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.572 [2024-04-18 11:44:53.884935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.884951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.572 [2024-04-18 11:44:53.885004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.885019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.572 [2024-04-18 11:44:53.885074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.885090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.572 [2024-04-18 11:44:53.934942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.572 [2024-04-18 11:44:53.934977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:53.935034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:53.935051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:53.935121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:53.935138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:53.935205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:53.935221] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:53.935296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:53.935313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.573 #12 NEW cov: 11928 ft: 14179 corp: 6/14b lim: 5 exec/s: 0 rss: 223Mb L: 5/5 MS: 1 ChangeBinInt- 00:09:03.573 [2024-04-18 11:44:53.988884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:53.988935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:54.028923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:54.028955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.573 #14 NEW cov: 11928 ft: 14269 corp: 7/15b lim: 5 exec/s: 0 rss: 224Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:03.573 [2024-04-18 11:44:54.085756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:54.085790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:54.085848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:54.085865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:54.085923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:54.085939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.573 [2024-04-18 11:44:54.085997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.573 [2024-04-18 11:44:54.086014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.832 [2024-04-18 11:44:54.135929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.832 [2024-04-18 11:44:54.135963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.832 [2024-04-18 11:44:54.136021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:03.832 [2024-04-18 11:44:54.136038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.832 [2024-04-18 11:44:54.136093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.832 [2024-04-18 11:44:54.136109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.136167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.136184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.833 #16 NEW cov: 11928 ft: 14309 corp: 8/19b lim: 5 exec/s: 0 rss: 226Mb L: 4/5 MS: 1 EraseBytes- 00:09:03.833 [2024-04-18 11:44:54.189592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.189638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.189717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.189755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.189825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.189848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.189920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.189943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.239604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.239637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.239696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.239712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.239767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 
11:44:54.239783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.239841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.239858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.833 #18 NEW cov: 11928 ft: 14332 corp: 9/23b lim: 5 exec/s: 0 rss: 228Mb L: 4/5 MS: 1 ShuffleBytes- 00:09:03.833 [2024-04-18 11:44:54.299079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.299117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.299176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.299194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.299253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.299269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.299340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.299358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.299419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.299436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.339177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.339209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.339267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.339283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.339347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.339363] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.339421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.339438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:03.833 [2024-04-18 11:44:54.339492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:03.833 [2024-04-18 11:44:54.339508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:03.833 #20 NEW cov: 11928 ft: 14379 corp: 10/28b lim: 5 exec/s: 0 rss: 229Mb L: 5/5 MS: 1 ChangeBit- 00:09:04.093 [2024-04-18 11:44:54.398595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.093 [2024-04-18 11:44:54.398631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.093 [2024-04-18 11:44:54.398696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.093 [2024-04-18 11:44:54.398714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.093 [2024-04-18 11:44:54.438710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.093 [2024-04-18 11:44:54.438743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.093 [2024-04-18 11:44:54.438802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.093 [2024-04-18 11:44:54.438819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.093 #22 NEW cov: 11928 ft: 14577 corp: 11/30b lim: 5 exec/s: 22 rss: 230Mb L: 2/5 MS: 1 InsertByte- 00:09:04.094 [2024-04-18 11:44:54.487262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.487299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.487356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.487373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.487450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:04.094 [2024-04-18 11:44:54.487467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.487519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.487536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.527264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.527296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.527352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.527368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.527428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.527444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.527503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.527518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.094 #24 NEW cov: 11928 ft: 14581 corp: 12/34b lim: 5 exec/s: 24 rss: 231Mb L: 4/5 MS: 1 EraseBytes- 00:09:04.094 [2024-04-18 11:44:54.588576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.588619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.094 [2024-04-18 11:44:54.628546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.094 [2024-04-18 11:44:54.628577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.353 #26 NEW cov: 11928 ft: 14607 corp: 13/35b lim: 5 exec/s: 26 rss: 232Mb L: 1/5 MS: 1 ChangeByte- 00:09:04.353 [2024-04-18 11:44:54.683394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.353 [2024-04-18 11:44:54.683439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.353 [2024-04-18 11:44:54.733395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 
cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.353 [2024-04-18 11:44:54.733447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.354 #28 NEW cov: 11928 ft: 14689 corp: 14/36b lim: 5 exec/s: 28 rss: 234Mb L: 1/5 MS: 1 ChangeByte- 00:09:04.354 [2024-04-18 11:44:54.786690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.354 [2024-04-18 11:44:54.786736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.354 [2024-04-18 11:44:54.786828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.354 [2024-04-18 11:44:54.786853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.354 [2024-04-18 11:44:54.836743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.354 [2024-04-18 11:44:54.836774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.354 [2024-04-18 11:44:54.836831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.354 [2024-04-18 11:44:54.836847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.354 #30 NEW cov: 11928 ft: 14718 corp: 15/38b lim: 5 exec/s: 30 rss: 235Mb L: 2/5 MS: 1 InsertByte- 00:09:04.354 [2024-04-18 11:44:54.894699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.354 [2024-04-18 11:44:54.894737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:54.944788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:54.944821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.613 #32 NEW cov: 11928 ft: 14741 corp: 16/39b lim: 5 exec/s: 32 rss: 236Mb L: 1/5 MS: 1 ShuffleBytes- 00:09:04.613 [2024-04-18 11:44:54.994096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:54.994130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:54.994192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:54.994209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:54.994265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:54.994281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:54.994362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:54.994379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:55.044303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.044335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:55.044391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.044407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:55.044484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.044502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:55.044562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.044578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:04.613 #34 NEW cov: 11928 ft: 14815 corp: 17/43b lim: 5 exec/s: 34 rss: 239Mb L: 4/5 MS: 1 ChangeByte- 00:09:04.613 [2024-04-18 11:44:55.105551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.105588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:04.613 [2024-04-18 11:44:55.105652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:04.613 [2024-04-18 11:44:55.105672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.182 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:05.182 [2024-04-18 11:44:55.467640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.182 [2024-04-18 
11:44:55.467710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:05.182 [2024-04-18 11:44:55.467787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:05.182 [2024-04-18 11:44:55.467817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:05.183 #36 NEW cov: 11945 ft: 14848 corp: 18/45b lim: 5 exec/s: 18 rss: 240Mb L: 2/5 MS: 1 CopyPart- 00:09:05.183 #36 DONE cov: 11945 ft: 14848 corp: 18/45b lim: 5 exec/s: 18 rss: 240Mb 00:09:05.183 Done 36 runs in 2 second(s) 00:09:05.442 11:44:55 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:09:05.442 11:44:55 -- ../common.sh@72 -- # (( i++ )) 00:09:05.442 11:44:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:05.442 11:44:55 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:09:05.442 11:44:55 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:09:05.442 11:44:55 -- nvmf/run.sh@24 -- # local timen=1 00:09:05.442 11:44:55 -- nvmf/run.sh@25 -- # local core=0x1 00:09:05.442 11:44:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:05.442 11:44:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:09:05.442 11:44:55 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:05.442 11:44:55 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:05.442 11:44:55 -- nvmf/run.sh@34 -- # printf %02d 10 00:09:05.442 11:44:55 -- nvmf/run.sh@34 -- # port=4410 00:09:05.442 11:44:55 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:05.442 11:44:55 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:09:05.442 11:44:55 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:05.442 11:44:55 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:05.442 11:44:55 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:05.442 11:44:55 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:09:05.701 [2024-04-18 11:44:55.996152] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
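[Editor's note] The nvmf/run.sh xtrace above shows how each fuzzer instance is wired up before llvm_nvme_fuzz starts. A minimal shell sketch of those traced steps follows, reconstructed from the xtrace output alone and not from the actual run.sh source: the $rootdir variable, the output redirections, and the port derivation from printf %02d are assumptions, since xtrace does not show them.

# Sketch of the per-fuzzer setup traced above (fuzzer_type=10 shown; type 11 is identical
# apart from the number). Redirections and $rootdir are assumed, not taken from the trace.
fuzzer_type=10
timen=1                                   # -t: seconds per fuzzer run
core=0x1                                  # -m: core mask
port="44$(printf %02d "$fuzzer_type")"    # trace shows port=4410 for type 10, 4411 for 11
corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
suppress_file=/var/tmp/suppress_nvmf_fuzz

mkdir -p "$corpus_dir"
# Retarget the JSON config at this fuzzer's private TCP port (the template uses 4420).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
# Suppress two known allocations for LeakSanitizer (matches LSAN_OPTIONS in the trace).
echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
echo leak:nvmf_ctrlr_create >> "$suppress_file"

LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
    -P "$rootdir/../output/llvm/" \
    -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
    -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

The -Z argument selects which admin-command fuzzer to run (here fuzz_admin_security_receive_command, per the NEW_FUNC line below), and each instance gets its own listener port so runs do not collide.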
00:09:05.701 [2024-04-18 11:44:55.996246] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid369800 ] 00:09:05.701 EAL: No free 2048 kB hugepages reported on node 1 00:09:05.701 [2024-04-18 11:44:56.248707] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.961 [2024-04-18 11:44:56.400616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.221 [2024-04-18 11:44:56.646384] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:06.221 [2024-04-18 11:44:56.662613] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:09:06.221 INFO: Running with entropic power schedule (0xFF, 100). 00:09:06.221 INFO: Seed: 3820241539 00:09:06.221 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:06.221 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:06.221 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:09:06.221 INFO: A corpus is not provided, starting from an empty corpus 00:09:06.221 #2 INITED exec/s: 0 rss: 200Mb 00:09:06.221 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:06.221 This may also happen if the target rejected all inputs we tried so far 00:09:06.221 [2024-04-18 11:44:56.712402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.221 [2024-04-18 11:44:56.712444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.221 [2024-04-18 11:44:56.712497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.221 [2024-04-18 11:44:56.712514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.790 NEW_FUNC[1/669]: 0x557840 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:09:06.790 NEW_FUNC[2/669]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:06.790 [2024-04-18 11:44:57.086614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.790 [2024-04-18 11:44:57.086683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.790 [2024-04-18 11:44:57.086794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.790 [2024-04-18 11:44:57.086821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.790 #10 NEW cov: 11736 ft: 11709 corp: 2/18b lim: 40 exec/s: 0 rss: 216Mb L: 17/17 MS: 2 InsertByte-InsertRepeatedBytes- 00:09:06.790 [2024-04-18 11:44:57.167538] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61616161 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.790 [2024-04-18 11:44:57.167577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.167686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:61616161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.167705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.167803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.167824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.791 NEW_FUNC[1/1]: 0x1b5b780 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:412 00:09:06.791 [2024-04-18 11:44:57.237762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61616161 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.237795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.237884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:61616161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.237903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.237995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.238014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:06.791 #17 NEW cov: 11761 ft: 12456 corp: 3/46b lim: 40 exec/s: 0 rss: 218Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:09:06.791 [2024-04-18 11:44:57.309989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.310021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.310108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.310126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:06.791 [2024-04-18 11:44:57.310220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:06.791 [2024-04-18 11:44:57.310239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.050 [2024-04-18 11:44:57.370288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.050 [2024-04-18 11:44:57.370318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.050 [2024-04-18 11:44:57.370425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.370447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 [2024-04-18 11:44:57.370538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.370559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.051 #19 NEW cov: 11773 ft: 12739 corp: 4/74b lim: 40 exec/s: 0 rss: 220Mb L: 28/28 MS: 1 CrossOver- 00:09:07.051 [2024-04-18 11:44:57.441238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.441276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-04-18 11:44:57.441386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.441408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 [2024-04-18 11:44:57.491552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.491584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-04-18 11:44:57.491683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.491702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.051 #21 NEW cov: 11859 ft: 12952 corp: 5/91b lim: 40 exec/s: 0 rss: 221Mb L: 17/28 MS: 1 EraseBytes- 00:09:07.051 [2024-04-18 11:44:57.563267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.563302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.051 [2024-04-18 11:44:57.563403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.051 [2024-04-18 11:44:57.563427] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.623408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.623446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.623551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.623569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.310 #23 NEW cov: 11859 ft: 13150 corp: 6/109b lim: 40 exec/s: 0 rss: 223Mb L: 18/28 MS: 1 InsertByte- 00:09:07.310 [2024-04-18 11:44:57.683142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a161661 cdw11:61611616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.683178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.733248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a161661 cdw11:61611616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.733286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.310 #27 NEW cov: 11859 ft: 13542 corp: 7/120b lim: 40 exec/s: 27 rss: 224Mb L: 11/28 MS: 3 CopyPart-CrossOver-CrossOver- 00:09:07.310 [2024-04-18 11:44:57.800236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.800269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.800358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.800379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.860421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:61161640 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.860454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.310 [2024-04-18 11:44:57.860556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.310 [2024-04-18 11:44:57.860575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.570 #29 NEW cov: 11859 ft: 13641 corp: 8/138b lim: 40 exec/s: 29 rss: 226Mb L: 18/28 MS: 1 ShuffleBytes- 00:09:07.570 [2024-04-18 11:44:57.922986] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:57.923018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:57.923120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16111616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:57.923140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:57.973175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:57.973206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:57.973299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16111616 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:57.973317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.570 #31 NEW cov: 11859 ft: 13903 corp: 9/155b lim: 40 exec/s: 31 rss: 227Mb L: 17/28 MS: 1 ChangeBinInt- 00:09:07.570 [2024-04-18 11:44:58.034312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:58.034346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:58.034450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:58.034471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:58.094718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:58.094754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.570 [2024-04-18 11:44:58.094861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16166161 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.570 [2024-04-18 11:44:58.094882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.830 #33 NEW cov: 11859 ft: 13969 corp: 10/172b lim: 40 exec/s: 33 rss: 228Mb L: 17/28 MS: 1 EraseBytes- 00:09:07.830 [2024-04-18 11:44:58.168386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.168433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:09:07.830 [2024-04-18 11:44:58.168532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161661 cdw11:61611616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.168553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.168657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.168677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.218575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:1616160a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.218610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.218712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161661 cdw11:61611616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.218732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.218829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.218850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.830 #35 NEW cov: 11859 ft: 14009 corp: 11/201b lim: 40 exec/s: 35 rss: 230Mb L: 29/29 MS: 1 CrossOver- 00:09:07.830 [2024-04-18 11:44:58.293129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.293166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.293271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.293291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.293384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:61616116 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.293404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.343473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.343523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.343632] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.830 [2024-04-18 11:44:58.343653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:07.830 [2024-04-18 11:44:58.343741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:61616116 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:07.831 [2024-04-18 11:44:58.343761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.089 #37 NEW cov: 11859 ft: 14114 corp: 12/229b lim: 40 exec/s: 37 rss: 230Mb L: 28/29 MS: 1 CopyPart- 00:09:08.089 [2024-04-18 11:44:58.420037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.420074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.420168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.420188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.420285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:61161661 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.420305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.420404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:16161616 cdw11:16616116 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.420429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.420523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:16161616 cdw11:16160a2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.420543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:08.089 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:08.089 [2024-04-18 11:44:58.480232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.480265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.480372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.480392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.480475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:61161661 cdw11:61161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.480494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.480587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:16161616 cdw11:16616116 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.480610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.480698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:16161616 cdw11:16160a2c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.480716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:08.089 #39 NEW cov: 11876 ft: 14682 corp: 13/269b lim: 40 exec/s: 39 rss: 232Mb L: 40/40 MS: 1 CopyPart- 00:09:08.089 [2024-04-18 11:44:58.551430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.551464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.551556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.551582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.601620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.601650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.089 [2024-04-18 11:44:58.601756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.089 [2024-04-18 11:44:58.601776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.089 #41 NEW cov: 11876 ft: 14744 corp: 14/289b lim: 40 exec/s: 41 rss: 234Mb L: 20/40 MS: 1 EraseBytes- 00:09:08.348 [2024-04-18 11:44:58.663854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.348 [2024-04-18 11:44:58.663889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.348 [2024-04-18 11:44:58.663981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:167e1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.348 [2024-04-18 11:44:58.664000] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.348 [2024-04-18 11:44:58.724136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:16166161 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.348 [2024-04-18 11:44:58.724169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:08.348 [2024-04-18 11:44:58.724261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:167e1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:08.348 [2024-04-18 11:44:58.724281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:08.348 #43 NEW cov: 11876 ft: 14768 corp: 15/309b lim: 40 exec/s: 21 rss: 235Mb L: 20/40 MS: 1 ChangeByte- 00:09:08.348 #43 DONE cov: 11876 ft: 14768 corp: 15/309b lim: 40 exec/s: 21 rss: 235Mb 00:09:08.348 Done 43 runs in 2 second(s) 00:09:08.917 11:44:59 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:09:08.917 11:44:59 -- ../common.sh@72 -- # (( i++ )) 00:09:08.917 11:44:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:08.917 11:44:59 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:09:08.917 11:44:59 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:09:08.917 11:44:59 -- nvmf/run.sh@24 -- # local timen=1 00:09:08.917 11:44:59 -- nvmf/run.sh@25 -- # local core=0x1 00:09:08.917 11:44:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:08.917 11:44:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:09:08.917 11:44:59 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:08.917 11:44:59 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:08.917 11:44:59 -- nvmf/run.sh@34 -- # printf %02d 11 00:09:08.917 11:44:59 -- nvmf/run.sh@34 -- # port=4411 00:09:08.917 11:44:59 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:08.917 11:44:59 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:09:08.917 11:44:59 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:08.917 11:44:59 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:08.917 11:44:59 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:08.917 11:44:59 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:09:08.917 [2024-04-18 11:44:59.234977] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:09:08.917 [2024-04-18 11:44:59.235089] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid370187 ] 00:09:08.917 EAL: No free 2048 kB hugepages reported on node 1 00:09:09.175 [2024-04-18 11:44:59.503271] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.175 [2024-04-18 11:44:59.657473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.435 [2024-04-18 11:44:59.906993] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:09.435 [2024-04-18 11:44:59.923229] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:09:09.435 INFO: Running with entropic power schedule (0xFF, 100). 00:09:09.435 INFO: Seed: 2787234707 00:09:09.435 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:09.435 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:09.435 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:09:09.435 INFO: A corpus is not provided, starting from an empty corpus 00:09:09.435 #2 INITED exec/s: 0 rss: 200Mb 00:09:09.435 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:09.435 This may also happen if the target rejected all inputs we tried so far 00:09:09.694 [2024-04-18 11:45:00.001922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.694 [2024-04-18 11:45:00.001971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.694 [2024-04-18 11:45:00.002086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.694 [2024-04-18 11:45:00.002106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.694 [2024-04-18 11:45:00.002207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.694 [2024-04-18 11:45:00.002226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.953 NEW_FUNC[1/671]: 0x559960 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:09:09.953 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:09.953 [2024-04-18 11:45:00.382405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.953 [2024-04-18 11:45:00.382470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.953 [2024-04-18 11:45:00.382583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.953 [2024-04-18 
11:45:00.382604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:09.953 [2024-04-18 11:45:00.382694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.953 [2024-04-18 11:45:00.382713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:09.953 #10 NEW cov: 11749 ft: 11719 corp: 2/27b lim: 40 exec/s: 0 rss: 217Mb L: 26/26 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:09:09.953 [2024-04-18 11:45:00.463641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.953 [2024-04-18 11:45:00.463680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:09.953 [2024-04-18 11:45:00.463792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:09.953 [2024-04-18 11:45:00.463812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.212 [2024-04-18 11:45:00.514141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.212 [2024-04-18 11:45:00.514173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.212 [2024-04-18 11:45:00.514286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.212 [2024-04-18 11:45:00.514306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.212 #16 NEW cov: 11773 ft: 12421 corp: 3/49b lim: 40 exec/s: 0 rss: 218Mb L: 22/26 MS: 5 ShuffleBytes-ShuffleBytes-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:09:10.212 [2024-04-18 11:45:00.581641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.212 [2024-04-18 11:45:00.581676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.581781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.581801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.641872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.641904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.642014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.642036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.213 #18 NEW cov: 11785 ft: 12861 corp: 4/67b lim: 40 exec/s: 0 rss: 220Mb L: 18/26 MS: 1 EraseBytes- 00:09:10.213 [2024-04-18 11:45:00.714021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.714060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.714165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.714185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.714287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.714305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.213 [2024-04-18 11:45:00.714397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.213 [2024-04-18 11:45:00.714421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.774233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.774270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.774367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.774386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.774479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.774498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.774587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.774606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.472 #20 NEW cov: 11871 ft: 13515 corp: 5/102b lim: 40 exec/s: 0 rss: 222Mb L: 35/35 MS: 1 CopyPart- 00:09:10.472 [2024-04-18 11:45:00.850015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 
nsid:0 cdw10:3f3f3f3f cdw11:3f3f6e76 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.850056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.850164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6d663f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.850185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.910264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f6e76 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.910301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.910409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6d663f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.910450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.472 #22 NEW cov: 11871 ft: 13612 corp: 6/120b lim: 40 exec/s: 0 rss: 223Mb L: 18/35 MS: 1 CMP- DE: "nvmf"- 00:09:10.472 [2024-04-18 11:45:00.983258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.983307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.983410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.983442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.472 [2024-04-18 11:45:00.983555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.472 [2024-04-18 11:45:00.983581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.043734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.043769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.043864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.043885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.043972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 
11:45:01.043990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.732 #24 NEW cov: 11871 ft: 13655 corp: 7/146b lim: 40 exec/s: 24 rss: 224Mb L: 26/35 MS: 1 ShuffleBytes- 00:09:10.732 [2024-04-18 11:45:01.117437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.117471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.117575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:003f3f3f cdw11:3f3f3f6e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.117595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.117688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:766d663f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.117707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.177755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.177785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.177894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:003f3f3f cdw11:3f3f3f6e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.177928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.178022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:766d663f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.178040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.732 #26 NEW cov: 11871 ft: 13704 corp: 8/173b lim: 40 exec/s: 26 rss: 226Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:09:10.732 [2024-04-18 11:45:01.253201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.253234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.253336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.253357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.253455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:10.732 [2024-04-18 11:45:01.253474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.732 [2024-04-18 11:45:01.253580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.732 [2024-04-18 11:45:01.253599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.303462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.303495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.303591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.303609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.303697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.303717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.303810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.303829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:10.991 #28 NEW cov: 11871 ft: 13764 corp: 9/212b lim: 40 exec/s: 28 rss: 227Mb L: 39/39 MS: 1 CopyPart- 00:09:10.991 [2024-04-18 11:45:01.375734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.375766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.375867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:990e9999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.375890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.375985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.376001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.425936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.425968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.426078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:990e9999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.426096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.426184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.426205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.991 #30 NEW cov: 11871 ft: 13779 corp: 10/238b lim: 40 exec/s: 30 rss: 228Mb L: 26/39 MS: 1 ChangeByte- 00:09:10.991 [2024-04-18 11:45:01.498339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.498373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.498492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.498512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.498607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.498625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:10.991 [2024-04-18 11:45:01.498723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:10.991 [2024-04-18 11:45:01.498742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.250 [2024-04-18 11:45:01.548694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.250 [2024-04-18 11:45:01.548727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.548828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.548846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.548935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.548953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.549050] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.549071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.251 #37 NEW cov: 11871 ft: 13894 corp: 11/272b lim: 40 exec/s: 37 rss: 230Mb L: 34/39 MS: 1 InsertRepeatedBytes- 00:09:11.251 [2024-04-18 11:45:01.609481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f6e76 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.609514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.609628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6d663f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.609646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.609735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0ba9a9a9 cdw11:a9a9a9a9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.609754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.609854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.609872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.659863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f6e76 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.659896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.660012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6d663f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.660032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.660132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:0ba9a9a9 cdw11:a9a9a9a9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.660152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.660242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.660260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.251 #39 NEW cov: 11871 ft: 13972 corp: 12/309b lim: 40 exec/s: 39 rss: 231Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:09:11.251 
[2024-04-18 11:45:01.728876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.728912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.729013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff9999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.729038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.729128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99990e99 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.729146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.729245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.729264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.789244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.789277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.789385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff9999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.789405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.789503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:99990e99 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.789521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.251 [2024-04-18 11:45:01.789623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.251 [2024-04-18 11:45:01.789641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.511 #41 NEW cov: 11871 ft: 14079 corp: 13/344b lim: 40 exec/s: 41 rss: 232Mb L: 35/39 MS: 1 InsertRepeatedBytes- 00:09:11.511 [2024-04-18 11:45:01.862581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.862619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.511 [2024-04-18 11:45:01.862719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff99 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.862741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.511 [2024-04-18 11:45:01.862842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:9999990e cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.862863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.511 [2024-04-18 11:45:01.862959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.862980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.511 [2024-04-18 11:45:01.922876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a999999 cdw11:99ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.922908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.511 [2024-04-18 11:45:01.923019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff99 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.511 [2024-04-18 11:45:01.923043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:01.923144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:9999990e cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:01.923163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:01.923261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:01.923280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.512 #43 NEW cov: 11871 ft: 14144 corp: 14/380b lim: 40 exec/s: 43 rss: 233Mb L: 36/39 MS: 1 CrossOver- 00:09:11.512 [2024-04-18 11:45:01.985927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:01.985960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:01.986057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:01.986076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:01.986168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:09:11.512 [2024-04-18 11:45:01.986185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:01.986276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:01.986294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:02.046146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:02.046177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:02.046296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:02.046316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:02.046412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:02.046435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:11.512 [2024-04-18 11:45:02.046534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:3f3f3f3f cdw11:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:11.512 [2024-04-18 11:45:02.046554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:11.772 #45 NEW cov: 11871 ft: 14223 corp: 15/418b lim: 40 exec/s: 22 rss: 235Mb L: 38/39 MS: 1 InsertRepeatedBytes- 00:09:11.772 #45 DONE cov: 11871 ft: 14223 corp: 15/418b lim: 40 exec/s: 22 rss: 236Mb 00:09:11.772 ###### Recommended dictionary. ###### 00:09:11.772 "nvmf" # Uses: 1 00:09:11.772 ###### End of recommended dictionary. 
###### 00:09:11.772 Done 45 runs in 2 second(s) 00:09:12.031 11:45:02 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:09:12.031 11:45:02 -- ../common.sh@72 -- # (( i++ )) 00:09:12.031 11:45:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.031 11:45:02 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:09:12.031 11:45:02 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:09:12.031 11:45:02 -- nvmf/run.sh@24 -- # local timen=1 00:09:12.031 11:45:02 -- nvmf/run.sh@25 -- # local core=0x1 00:09:12.031 11:45:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:12.031 11:45:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:09:12.031 11:45:02 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:12.031 11:45:02 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:12.031 11:45:02 -- nvmf/run.sh@34 -- # printf %02d 12 00:09:12.031 11:45:02 -- nvmf/run.sh@34 -- # port=4412 00:09:12.031 11:45:02 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:12.031 11:45:02 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:09:12.031 11:45:02 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:12.031 11:45:02 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:12.031 11:45:02 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:12.031 11:45:02 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:09:12.291 [2024-04-18 11:45:02.584924] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:12.291 [2024-04-18 11:45:02.585024] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid370848 ] 00:09:12.291 EAL: No free 2048 kB hugepages reported on node 1 00:09:12.553 [2024-04-18 11:45:02.855141] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.553 [2024-04-18 11:45:03.010629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.830 [2024-04-18 11:45:03.266478] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:12.830 [2024-04-18 11:45:03.282715] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:09:12.830 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:12.830 INFO: Seed: 1850290683 00:09:12.830 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:12.830 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:12.830 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:09:12.830 INFO: A corpus is not provided, starting from an empty corpus 00:09:12.830 #2 INITED exec/s: 0 rss: 200Mb 00:09:12.830 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:12.830 This may also happen if the target rejected all inputs we tried so far 00:09:12.830 [2024-04-18 11:45:03.332349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:12.830 [2024-04-18 11:45:03.332389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 NEW_FUNC[1/671]: 0x55ba80 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:09:13.398 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:13.398 [2024-04-18 11:45:03.693242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.693298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 #19 NEW cov: 11747 ft: 11720 corp: 2/9b lim: 40 exec/s: 0 rss: 217Mb L: 8/8 MS: 4 InsertByte-InsertByte-InsertByte-CMP- DE: "\377\377\377\377"- 00:09:13.398 [2024-04-18 11:45:03.757702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:babababa cdw11:babababa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.757765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 [2024-04-18 11:45:03.757822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:bababaff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.757841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.398 [2024-04-18 11:45:03.807848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:babababa cdw11:babababa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.807885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 [2024-04-18 11:45:03.807944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:bababaff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.807962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.398 #33 NEW cov: 11771 ft: 12716 corp: 3/28b lim: 40 exec/s: 0 rss: 218Mb L: 19/19 MS: 3 EraseBytes-PersAutoDict-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:09:13.398 [2024-04-18 
11:45:03.868461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.868498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 [2024-04-18 11:45:03.908526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.398 [2024-04-18 11:45:03.908560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.398 #48 NEW cov: 11783 ft: 13105 corp: 4/39b lim: 40 exec/s: 0 rss: 220Mb L: 11/19 MS: 4 ShuffleBytes-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:09:13.658 [2024-04-18 11:45:03.958069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:61800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:03.958110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:03.958169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:456c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:03.958188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:04.008177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:61800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.008218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:04.008277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:456c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.008294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.658 #50 NEW cov: 11869 ft: 13395 corp: 5/58b lim: 40 exec/s: 0 rss: 222Mb L: 19/19 MS: 1 CMP- DE: "\000\000a\200\000\000El"- 00:09:13.658 [2024-04-18 11:45:04.067557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:61ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.067593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:04.067651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0b6c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.067672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:04.117747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:61ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.117787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.658 [2024-04-18 11:45:04.117847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0b6c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.117869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:13.658 #52 NEW cov: 11869 ft: 13446 corp: 6/77b lim: 40 exec/s: 0 rss: 223Mb L: 19/19 MS: 1 CMP- DE: "\377\377\377\013"- 00:09:13.658 [2024-04-18 11:45:04.176018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffbf cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.658 [2024-04-18 11:45:04.176057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 [2024-04-18 11:45:04.216137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffbf cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.917 [2024-04-18 11:45:04.216173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 #54 NEW cov: 11869 ft: 13522 corp: 7/85b lim: 40 exec/s: 0 rss: 225Mb L: 8/19 MS: 1 ChangeBit- 00:09:13.917 [2024-04-18 11:45:04.275039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffbf cdw11:ffffbb3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.917 [2024-04-18 11:45:04.275075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 [2024-04-18 11:45:04.325194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffbf cdw11:ffffbb3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.917 [2024-04-18 11:45:04.325232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 #61 NEW cov: 11869 ft: 13610 corp: 8/93b lim: 40 exec/s: 61 rss: 226Mb L: 8/19 MS: 1 ChangeByte- 00:09:13.917 [2024-04-18 11:45:04.374227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffbbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.917 [2024-04-18 11:45:04.374261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 [2024-04-18 11:45:04.414269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffbbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:13.917 [2024-04-18 11:45:04.414301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:13.917 #63 NEW cov: 11869 ft: 13777 corp: 9/101b lim: 40 exec/s: 63 rss: 227Mb L: 8/19 MS: 1 ChangeBit- 00:09:14.176 [2024-04-18 11:45:04.470687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbb0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.176 [2024-04-18 11:45:04.470724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.470798] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:23adffff cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.470815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.510807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbb0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.510840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.510898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:23adffff cdw11:ffffbbad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.510915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.177 #65 NEW cov: 11869 ft: 13786 corp: 10/117b lim: 40 exec/s: 65 rss: 229Mb L: 16/19 MS: 1 CrossOver- 00:09:14.177 [2024-04-18 11:45:04.567254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbb0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.567288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.567362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:23adffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.567379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.617419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffbb0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.617451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.617525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:23adffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.617542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.177 #67 NEW cov: 11869 ft: 13856 corp: 11/137b lim: 40 exec/s: 67 rss: 230Mb L: 20/20 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:09:14.177 [2024-04-18 11:45:04.671265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.671300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.671359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffbb0aff cdw11:ffffff23 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.671376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
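[editor's note] Most of the volume in these runs is the target echoing each rejected admin command (the paired nvme_admin_qpair_print_command / spdk_nvme_print_completion NOTICE records); the progress signal is carried by the libFuzzer status lines. A quick filter for a saved copy of this console output — the build.log file name is an assumption:

# Sketch: reduce the console output to libFuzzer status lines
# (#N INITED/NEW/REDUCE/DONE) plus each run's closing summary.
grep -E '#[0-9]+ (INITED|NEW|REDUCE|DONE)|Done [0-9]+ runs' build.log
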
00:09:14.177 [2024-04-18 11:45:04.721427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.721460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.177 [2024-04-18 11:45:04.721518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffbb0aff cdw11:ffffff23 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.177 [2024-04-18 11:45:04.721535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.436 #69 NEW cov: 11869 ft: 13957 corp: 12/160b lim: 40 exec/s: 69 rss: 232Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:09:14.436 [2024-04-18 11:45:04.779952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffb0800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.436 [2024-04-18 11:45:04.779988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.436 [2024-04-18 11:45:04.830089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffb0800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.436 [2024-04-18 11:45:04.830122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.436 #71 NEW cov: 11869 ft: 14017 corp: 13/168b lim: 40 exec/s: 71 rss: 233Mb L: 8/23 MS: 1 ChangeBinInt- 00:09:14.436 [2024-04-18 11:45:04.890803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffbb230a cdw11:fffbffad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.436 [2024-04-18 11:45:04.890841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.436 [2024-04-18 11:45:04.930933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffbb230a cdw11:fffbffad SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.436 [2024-04-18 11:45:04.930967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.436 #73 NEW cov: 11869 ft: 14036 corp: 14/176b lim: 40 exec/s: 73 rss: 234Mb L: 8/23 MS: 1 ShuffleBytes- 00:09:14.695 [2024-04-18 11:45:04.991611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:babababa cdw11:bababa00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.695 [2024-04-18 11:45:04.991648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.695 [2024-04-18 11:45:04.991708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00618000 cdw11:00456cba SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.695 [2024-04-18 11:45:04.991726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.695 [2024-04-18 11:45:04.991782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:bababaff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.695 
[2024-04-18 11:45:04.991799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.695 [2024-04-18 11:45:05.041773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:babababa cdw11:bababa00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.695 [2024-04-18 11:45:05.041807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.695 [2024-04-18 11:45:05.041866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00618000 cdw11:00456cba SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.695 [2024-04-18 11:45:05.041884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:14.696 [2024-04-18 11:45:05.041939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:bababaff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.696 [2024-04-18 11:45:05.041955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:14.696 #75 NEW cov: 11869 ft: 14270 corp: 15/203b lim: 40 exec/s: 75 rss: 235Mb L: 27/27 MS: 1 PersAutoDict- DE: "\000\000a\200\000\000El"- 00:09:14.696 [2024-04-18 11:45:05.102347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.696 [2024-04-18 11:45:05.102386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.696 [2024-04-18 11:45:05.142409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.696 [2024-04-18 11:45:05.142450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.696 #77 NEW cov: 11869 ft: 14274 corp: 16/214b lim: 40 exec/s: 77 rss: 237Mb L: 11/27 MS: 1 ShuffleBytes- 00:09:14.696 [2024-04-18 11:45:05.203024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.696 [2024-04-18 11:45:05.203059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.955 [2024-04-18 11:45:05.253168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.955 [2024-04-18 11:45:05.253202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.955 #79 NEW cov: 11869 ft: 14283 corp: 17/225b lim: 40 exec/s: 79 rss: 238Mb L: 11/27 MS: 1 ChangeByte- 00:09:14.955 [2024-04-18 11:45:05.308704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffb08ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.955 [2024-04-18 11:45:05.308739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.955 [2024-04-18 11:45:05.358838] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a23ffff cdw11:fffb08ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:14.955 [2024-04-18 11:45:05.358871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:14.955 #81 NEW cov: 11869 ft: 14323 corp: 18/237b lim: 40 exec/s: 40 rss: 239Mb L: 12/27 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:09:14.955 #81 DONE cov: 11869 ft: 14323 corp: 18/237b lim: 40 exec/s: 40 rss: 240Mb 00:09:14.955 ###### Recommended dictionary. ###### 00:09:14.955 "\377\377\377\377" # Uses: 4 00:09:14.955 "\000\000a\200\000\000El" # Uses: 1 00:09:14.955 "\377\377\377\013" # Uses: 0 00:09:14.955 ###### End of recommended dictionary. ###### 00:09:14.955 Done 81 runs in 2 second(s) 00:09:15.524 11:45:05 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:09:15.524 11:45:05 -- ../common.sh@72 -- # (( i++ )) 00:09:15.524 11:45:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:15.524 11:45:05 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:09:15.524 11:45:05 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:09:15.524 11:45:05 -- nvmf/run.sh@24 -- # local timen=1 00:09:15.524 11:45:05 -- nvmf/run.sh@25 -- # local core=0x1 00:09:15.524 11:45:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:15.524 11:45:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:09:15.524 11:45:05 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:15.524 11:45:05 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:15.524 11:45:05 -- nvmf/run.sh@34 -- # printf %02d 13 00:09:15.524 11:45:05 -- nvmf/run.sh@34 -- # port=4413 00:09:15.524 11:45:05 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:15.524 11:45:05 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:09:15.524 11:45:05 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:15.524 11:45:05 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:15.524 11:45:05 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:15.524 11:45:05 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:09:15.524 [2024-04-18 11:45:05.866130] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:09:15.524 [2024-04-18 11:45:05.866234] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid371621 ] 00:09:15.524 EAL: No free 2048 kB hugepages reported on node 1 00:09:15.783 [2024-04-18 11:45:06.116701] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.783 [2024-04-18 11:45:06.270212] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.042 [2024-04-18 11:45:06.533818] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:16.042 [2024-04-18 11:45:06.550031] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:09:16.042 INFO: Running with entropic power schedule (0xFF, 100). 00:09:16.042 INFO: Seed: 823298560 00:09:16.042 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:16.042 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:16.042 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:09:16.042 INFO: A corpus is not provided, starting from an empty corpus 00:09:16.042 #2 INITED exec/s: 0 rss: 199Mb 00:09:16.042 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:16.042 This may also happen if the target rejected all inputs we tried so far 00:09:16.301 [2024-04-18 11:45:06.595914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.301 [2024-04-18 11:45:06.595953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.301 [2024-04-18 11:45:06.596014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.301 [2024-04-18 11:45:06.596032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.560 NEW_FUNC[1/670]: 0x55d9b0 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:09:16.560 NEW_FUNC[2/670]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:16.560 [2024-04-18 11:45:06.948182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:06.948236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:06.948296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:06.948313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.560 #21 NEW cov: 11735 ft: 11708 corp: 2/22b lim: 40 exec/s: 0 rss: 216Mb L: 21/21 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:09:16.560 [2024-04-18 11:45:07.001132] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.001171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.001229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.001249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.001306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.001322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.001380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.001398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.041178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.041213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.041274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.041291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.041351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.041367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.041432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.041448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.560 #24 NEW cov: 11759 ft: 12657 corp: 3/57b lim: 40 exec/s: 0 rss: 219Mb L: 35/35 MS: 2 ChangeByte-InsertRepeatedBytes- 00:09:16.560 [2024-04-18 11:45:07.099171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.099205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.560 [2024-04-18 11:45:07.099263] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.560 [2024-04-18 11:45:07.099280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.561 [2024-04-18 11:45:07.099338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.561 [2024-04-18 11:45:07.099354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.561 [2024-04-18 11:45:07.099410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.561 [2024-04-18 11:45:07.099430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.149293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.149329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.149388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.149405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.149466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.149482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.149539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.149555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.820 #26 NEW cov: 11771 ft: 12883 corp: 4/96b lim: 40 exec/s: 0 rss: 220Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:09:16.820 [2024-04-18 11:45:07.196008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.196041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.196104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdffff cdw11:fffffdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.196121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.196178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.196194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.196250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.196266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.236100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.236132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.236190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdffff cdw11:fffffdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.236207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.236267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.236283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.236340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.236356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.820 #28 NEW cov: 11857 ft: 13458 corp: 5/135b lim: 40 exec/s: 0 rss: 221Mb L: 39/39 MS: 1 CrossOver- 00:09:16.820 [2024-04-18 11:45:07.296115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.296155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.296217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.296234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.296291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.296307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.296363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.296379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.346206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.820 [2024-04-18 11:45:07.346237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:16.820 [2024-04-18 11:45:07.346316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.821 [2024-04-18 11:45:07.346333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:16.821 [2024-04-18 11:45:07.346392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.821 [2024-04-18 11:45:07.346408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:16.821 [2024-04-18 11:45:07.346476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:16.821 [2024-04-18 11:45:07.346493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:16.821 #30 NEW cov: 11857 ft: 13628 corp: 6/170b lim: 40 exec/s: 0 rss: 223Mb L: 35/39 MS: 1 InsertRepeatedBytes- 00:09:17.080 [2024-04-18 11:45:07.392678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.392713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.392772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.392790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.392845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.392861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.392919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.392938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.442767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.442800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.442859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.442876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.442930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.442947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.080 [2024-04-18 11:45:07.443003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.080 [2024-04-18 11:45:07.443020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.080 #32 NEW cov: 11857 ft: 13703 corp: 7/205b lim: 40 exec/s: 0 rss: 225Mb L: 35/39 MS: 1 ShuffleBytes- 00:09:17.080 [2024-04-18 11:45:07.504441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fd0000 cdw11:000000fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.504477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.504540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.504558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.504613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.504630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.504687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.504704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.504765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.504787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.544598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fd0000 cdw11:000000fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 
[2024-04-18 11:45:07.544633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.544693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.544714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.544772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.544790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.544849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.544865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.544921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.544937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:17.081 #34 NEW cov: 11857 ft: 13778 corp: 8/245b lim: 40 exec/s: 0 rss: 226Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:09:17.081 [2024-04-18 11:45:07.591154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fd7dfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.591188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.591248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.591265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.591323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.591340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.591397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.591419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.631289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fd7dfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.631320] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.081 [2024-04-18 11:45:07.631397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.081 [2024-04-18 11:45:07.631423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.631483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.631501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.631562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.631581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.341 #36 NEW cov: 11857 ft: 13818 corp: 9/280b lim: 40 exec/s: 36 rss: 227Mb L: 35/40 MS: 1 ChangeBit- 00:09:17.341 [2024-04-18 11:45:07.689226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.689261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.689322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.689339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.689396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.689412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.689475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.689491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.689548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.689564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.739423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.739457] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.739516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.739533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.739589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.739605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.739661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.739676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.739733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.739748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:17.341 #38 NEW cov: 11857 ft: 13887 corp: 10/320b lim: 40 exec/s: 38 rss: 229Mb L: 40/40 MS: 1 CrossOver- 00:09:17.341 [2024-04-18 11:45:07.786059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.786093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.786174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.786192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.826183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.826215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.826279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.826296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 #40 NEW cov: 11857 ft: 14006 corp: 11/341b lim: 40 exec/s: 40 rss: 230Mb L: 21/40 MS: 1 ChangeBinInt- 00:09:17.341 [2024-04-18 11:45:07.873055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 
[2024-04-18 11:45:07.873088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.873148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.873165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.873221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.873238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.873298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fd29fdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.873314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.341 [2024-04-18 11:45:07.873371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.341 [2024-04-18 11:45:07.873387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.923199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.923233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.923292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.923309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.923375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.923391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.923455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fd29fdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.923474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.923532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.923548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 
00:09:17.601 #42 NEW cov: 11857 ft: 14043 corp: 12/381b lim: 40 exec/s: 42 rss: 231Mb L: 40/40 MS: 1 InsertByte- 00:09:17.601 [2024-04-18 11:45:07.971109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdddfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.971143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.971201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.971218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.971280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.971296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:07.971352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:07.971368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.011237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdddfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.011269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.011329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.011346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.011405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.011426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.011484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.011500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.601 #44 NEW cov: 11857 ft: 14107 corp: 13/420b lim: 40 exec/s: 44 rss: 232Mb L: 39/40 MS: 1 ChangeBit- 00:09:17.601 [2024-04-18 11:45:08.058094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:7dfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.058127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.058190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.058210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.058284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.058300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.058361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.058378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.108197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:7dfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.108230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.108288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.108305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.108361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.108377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.601 [2024-04-18 11:45:08.108442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.601 [2024-04-18 11:45:08.108459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:17.602 #46 NEW cov: 11857 ft: 14156 corp: 14/455b lim: 40 exec/s: 46 rss: 234Mb L: 35/40 MS: 1 ShuffleBytes- 00:09:17.861 [2024-04-18 11:45:08.155055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.155090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.155167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.155185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.205260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.205294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.205372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdff cdw11:fffffffd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.205390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.861 #48 NEW cov: 11857 ft: 14188 corp: 15/476b lim: 40 exec/s: 48 rss: 235Mb L: 21/40 MS: 1 EraseBytes- 00:09:17.861 [2024-04-18 11:45:08.268038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.268075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.268134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.268151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.268212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.268229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.861 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:17.861 [2024-04-18 11:45:08.308098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.308131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.308190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.308207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.308264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:acacacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.308280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.861 #50 NEW cov: 11874 ft: 14410 corp: 16/501b lim: 40 exec/s: 50 rss: 237Mb L: 25/40 MS: 1 EraseBytes- 00:09:17.861 [2024-04-18 11:45:08.369328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff 
cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.369363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.369442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.369461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.369519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:24acacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.369536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:17.861 [2024-04-18 11:45:08.369592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:17.861 [2024-04-18 11:45:08.369609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.419475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.419508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.419567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.419588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.419646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:24acacff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.419663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.419720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.419736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 #52 NEW cov: 11874 ft: 14435 corp: 17/536b lim: 40 exec/s: 52 rss: 238Mb L: 35/40 MS: 1 ChangeByte- 00:09:18.121 [2024-04-18 11:45:08.469543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fd7dfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.469577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.469637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.469654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.469711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.469727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.469784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.469800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.519646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:04fdfdfd cdw11:fd7dfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.519678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.519738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.519755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.519813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.519829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.519884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.519900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 #54 NEW cov: 11874 ft: 14478 corp: 18/571b lim: 40 exec/s: 54 rss: 239Mb L: 35/40 MS: 1 ShuffleBytes- 00:09:18.121 [2024-04-18 11:45:08.581455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.581491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.581557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.581575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.581646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:24acaeff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 
11:45:08.581663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.581719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.581735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.631652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.631684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.631750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:acacacac cdw11:acacacac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.631768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.631826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:24acaeff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.631843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:18.121 [2024-04-18 11:45:08.631900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:18.121 [2024-04-18 11:45:08.631916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:18.121 #56 NEW cov: 11874 ft: 14565 corp: 19/606b lim: 40 exec/s: 28 rss: 240Mb L: 35/40 MS: 1 ChangeBit- 00:09:18.121 #56 DONE cov: 11874 ft: 14565 corp: 19/606b lim: 40 exec/s: 28 rss: 240Mb 00:09:18.121 Done 56 runs in 2 second(s) 00:09:18.690 11:45:09 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:09:18.690 11:45:09 -- ../common.sh@72 -- # (( i++ )) 00:09:18.690 11:45:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:18.690 11:45:09 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:09:18.690 11:45:09 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:09:18.690 11:45:09 -- nvmf/run.sh@24 -- # local timen=1 00:09:18.690 11:45:09 -- nvmf/run.sh@25 -- # local core=0x1 00:09:18.690 11:45:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:18.690 11:45:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:09:18.690 11:45:09 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:18.690 11:45:09 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:18.690 11:45:09 -- nvmf/run.sh@34 -- # printf %02d 14 00:09:18.690 11:45:09 -- nvmf/run.sh@34 -- # port=4414 00:09:18.690 11:45:09 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:18.691 11:45:09 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 
traddr:127.0.0.1 trsvcid:4414' 00:09:18.691 11:45:09 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:18.691 11:45:09 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:18.691 11:45:09 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:18.691 11:45:09 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:09:18.691 [2024-04-18 11:45:09.148593] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:18.691 [2024-04-18 11:45:09.148688] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid372141 ] 00:09:18.691 EAL: No free 2048 kB hugepages reported on node 1 00:09:18.950 [2024-04-18 11:45:09.413668] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.208 [2024-04-18 11:45:09.568750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.466 [2024-04-18 11:45:09.812960] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:19.466 [2024-04-18 11:45:09.829180] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:09:19.466 INFO: Running with entropic power schedule (0xFF, 100). 00:09:19.466 INFO: Seed: 4103322394 00:09:19.466 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:19.466 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:19.466 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:09:19.466 INFO: A corpus is not provided, starting from an empty corpus 00:09:19.466 #2 INITED exec/s: 0 rss: 200Mb 00:09:19.466 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:19.466 This may also happen if the target rejected all inputs we tried so far 00:09:19.466 [2024-04-18 11:45:09.895040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.466 [2024-04-18 11:45:09.895077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:19.725 NEW_FUNC[1/671]: 0x55f8e0 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:09:19.725 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:19.725 [2024-04-18 11:45:10.268774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:19.725 [2024-04-18 11:45:10.268835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.001 #6 NEW cov: 11729 ft: 11701 corp: 2/10b lim: 35 exec/s: 0 rss: 216Mb L: 9/9 MS: 3 CrossOver-ChangeBit-CMP- DE: "\377\003\375\347K\031\337."- 00:09:20.001 [2024-04-18 11:45:10.347633] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000a2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.001 [2024-04-18 11:45:10.347684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.001 [2024-04-18 11:45:10.407814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000a2 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.001 [2024-04-18 11:45:10.407847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.001 #8 NEW cov: 11753 ft: 12163 corp: 3/19b lim: 35 exec/s: 0 rss: 218Mb L: 9/9 MS: 1 ChangeByte- 00:09:20.001 [2024-04-18 11:45:10.478590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.001 [2024-04-18 11:45:10.478636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.001 [2024-04-18 11:45:10.528572] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.001 [2024-04-18 11:45:10.528605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.266 #12 NEW cov: 11765 ft: 12331 corp: 4/27b lim: 35 exec/s: 0 rss: 220Mb L: 8/9 MS: 3 EraseBytes-ChangeBit-CopyPart- 00:09:20.266 [2024-04-18 11:45:10.598879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.266 [2024-04-18 11:45:10.598914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.266 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:20.266 [2024-04-18 11:45:10.649066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.266 [2024-04-18 
11:45:10.649100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.266 #14 NEW cov: 11868 ft: 12696 corp: 5/34b lim: 35 exec/s: 0 rss: 221Mb L: 7/9 MS: 1 EraseBytes- 00:09:20.266 [2024-04-18 11:45:10.715172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.266 [2024-04-18 11:45:10.715205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.266 [2024-04-18 11:45:10.775481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.266 [2024-04-18 11:45:10.775511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.266 #16 NEW cov: 11868 ft: 12797 corp: 6/45b lim: 35 exec/s: 0 rss: 223Mb L: 11/11 MS: 1 CopyPart- 00:09:20.526 [2024-04-18 11:45:10.835109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.835141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.835232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.835251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.835354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.835375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.835476] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.835496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.835585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.835605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.895393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.895427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.895529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.895550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.895644] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.895670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.895755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.895775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.895860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000037 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.895877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:09:20.526 #18 NEW cov: 11868 ft: 13818 corp: 7/80b lim: 35 exec/s: 18 rss: 224Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:09:20.526 [2024-04-18 11:45:10.963383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.963430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.963537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000002e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.963567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:10.963657] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:10.963679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:11.013686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:11.013717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:11.013821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000002e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:11.013840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.526 [2024-04-18 11:45:11.013931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.526 [2024-04-18 11:45:11.013950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:20.526 #20 NEW cov: 11868 ft: 14109 corp: 8/105b lim: 35 exec/s: 20 rss: 225Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:09:20.785 [2024-04-18 11:45:11.078234] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.078269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID 
NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.785 [2024-04-18 11:45:11.078364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.078402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.785 NEW_FUNC[1/1]: 0x585260 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:09:20.785 [2024-04-18 11:45:11.128510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.128544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.785 [2024-04-18 11:45:11.128643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.128661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.785 #24 NEW cov: 11885 ft: 14339 corp: 9/124b lim: 35 exec/s: 24 rss: 227Mb L: 19/35 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:09:20.785 [2024-04-18 11:45:11.186729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.186766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.785 [2024-04-18 11:45:11.186871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.186892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.785 [2024-04-18 11:45:11.247076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.247109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:20.785 [2024-04-18 11:45:11.247211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.247233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:20.785 #26 NEW cov: 11885 ft: 14399 corp: 10/143b lim: 35 exec/s: 26 rss: 228Mb L: 19/35 MS: 1 ChangeBinInt- 00:09:20.785 [2024-04-18 11:45:11.307783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:20.785 [2024-04-18 11:45:11.307825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.044 [2024-04-18 11:45:11.368104] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.368135] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.044 #28 NEW cov: 11885 ft: 14458 corp: 11/152b lim: 35 exec/s: 28 rss: 229Mb L: 9/35 MS: 1 PersAutoDict- DE: "\377\003\375\347K\031\337."- 00:09:21.044 [2024-04-18 11:45:11.436098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.436141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.044 [2024-04-18 11:45:11.486424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.486455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.044 #30 NEW cov: 11885 ft: 14478 corp: 12/161b lim: 35 exec/s: 30 rss: 231Mb L: 9/35 MS: 1 ChangeBit- 00:09:21.044 [2024-04-18 11:45:11.558346] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.558393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.044 [2024-04-18 11:45:11.558513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.558542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:21.044 [2024-04-18 11:45:11.558648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.558676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:21.044 [2024-04-18 11:45:11.558776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.044 [2024-04-18 11:45:11.558804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:21.303 [2024-04-18 11:45:11.618612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.618648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.303 [2024-04-18 11:45:11.618743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.618764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:21.303 [2024-04-18 11:45:11.618850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.618872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:21.303 
[2024-04-18 11:45:11.618971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.618991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:21.303 #34 NEW cov: 11885 ft: 14592 corp: 13/192b lim: 35 exec/s: 34 rss: 232Mb L: 31/35 MS: 3 EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:09:21.303 [2024-04-18 11:45:11.689154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.689197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.303 [2024-04-18 11:45:11.749527] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.749560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.303 #36 NEW cov: 11885 ft: 14618 corp: 14/199b lim: 35 exec/s: 36 rss: 234Mb L: 7/35 MS: 1 ChangeBit- 00:09:21.303 [2024-04-18 11:45:11.820051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.303 [2024-04-18 11:45:11.820090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.562 [2024-04-18 11:45:11.870364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:21.562 [2024-04-18 11:45:11.870396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:21.562 #38 NEW cov: 11885 ft: 14656 corp: 15/211b lim: 35 exec/s: 19 rss: 234Mb L: 12/35 MS: 1 CrossOver- 00:09:21.562 #38 DONE cov: 11885 ft: 14656 corp: 15/211b lim: 35 exec/s: 19 rss: 234Mb 00:09:21.562 ###### Recommended dictionary. ###### 00:09:21.562 "\377\003\375\347K\031\337." # Uses: 1 00:09:21.562 ###### End of recommended dictionary. 
###### 00:09:21.562 Done 38 runs in 2 second(s) 00:09:21.820 11:45:12 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:09:21.820 11:45:12 -- ../common.sh@72 -- # (( i++ )) 00:09:21.820 11:45:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:21.820 11:45:12 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:09:21.820 11:45:12 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:09:21.820 11:45:12 -- nvmf/run.sh@24 -- # local timen=1 00:09:21.820 11:45:12 -- nvmf/run.sh@25 -- # local core=0x1 00:09:21.820 11:45:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:21.820 11:45:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:09:21.820 11:45:12 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:21.820 11:45:12 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:21.820 11:45:12 -- nvmf/run.sh@34 -- # printf %02d 15 00:09:21.820 11:45:12 -- nvmf/run.sh@34 -- # port=4415 00:09:21.820 11:45:12 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:21.820 11:45:12 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:09:21.820 11:45:12 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:21.820 11:45:12 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:21.820 11:45:12 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:21.820 11:45:12 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:09:22.079 [2024-04-18 11:45:12.388567] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:22.079 [2024-04-18 11:45:12.388657] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid372553 ] 00:09:22.079 EAL: No free 2048 kB hugepages reported on node 1 00:09:22.338 [2024-04-18 11:45:12.641986] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.338 [2024-04-18 11:45:12.796635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.596 [2024-04-18 11:45:13.046552] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:22.596 [2024-04-18 11:45:13.062802] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:09:22.596 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:22.596 INFO: Seed: 3042389008 00:09:22.596 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:22.596 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:22.596 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:09:22.596 INFO: A corpus is not provided, starting from an empty corpus 00:09:22.596 #2 INITED exec/s: 0 rss: 199Mb 00:09:22.596 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:22.596 This may also happen if the target rejected all inputs we tried so far 00:09:23.113 NEW_FUNC[1/657]: 0x561030 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:09:23.114 NEW_FUNC[2/657]: 0x585260 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:09:23.114 #7 NEW cov: 11601 ft: 11568 corp: 2/8b lim: 35 exec/s: 0 rss: 216Mb L: 7/7 MS: 4 InsertRepeatedBytes-ChangeBit-ChangeByte-InsertByte- 00:09:23.114 [2024-04-18 11:45:13.581666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.114 [2024-04-18 11:45:13.581725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.114 NEW_FUNC[1/14]: 0x1a4aa60 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:09:23.114 NEW_FUNC[2/14]: 0x1a4ace0 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:09:23.114 [2024-04-18 11:45:13.631869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.114 [2024-04-18 11:45:13.631912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.114 #11 NEW cov: 11755 ft: 12189 corp: 3/19b lim: 35 exec/s: 0 rss: 218Mb L: 11/11 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:09:23.372 #18 NEW cov: 11767 ft: 12371 corp: 4/26b lim: 35 exec/s: 0 rss: 220Mb L: 7/11 MS: 1 ChangeBit- 00:09:23.372 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:23.372 #20 NEW cov: 11870 ft: 12662 corp: 5/33b lim: 35 exec/s: 0 rss: 221Mb L: 7/11 MS: 1 ChangeByte- 00:09:23.630 #22 NEW cov: 11870 ft: 12834 corp: 6/41b lim: 35 exec/s: 0 rss: 223Mb L: 8/11 MS: 1 InsertByte- 00:09:23.630 [2024-04-18 11:45:14.068971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.630 [2024-04-18 11:45:14.069017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.630 [2024-04-18 11:45:14.129113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.630 [2024-04-18 11:45:14.129146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.630 #27 NEW cov: 11870 ft: 13044 corp: 7/48b lim: 35 exec/s: 27 rss: 224Mb L: 7/11 MS: 4 EraseBytes-ShuffleBytes-ChangeBinInt-CrossOver- 
00:09:23.889 [2024-04-18 11:45:14.190144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000032 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.889 [2024-04-18 11:45:14.190179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.889 [2024-04-18 11:45:14.240487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000032 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.889 [2024-04-18 11:45:14.240521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.889 #29 NEW cov: 11870 ft: 13192 corp: 8/57b lim: 35 exec/s: 29 rss: 226Mb L: 9/11 MS: 1 InsertByte- 00:09:23.889 [2024-04-18 11:45:14.300637] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000024a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.889 [2024-04-18 11:45:14.300670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.889 [2024-04-18 11:45:14.350707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000024a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:23.890 [2024-04-18 11:45:14.350741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:23.890 #31 NEW cov: 11870 ft: 13284 corp: 9/64b lim: 35 exec/s: 31 rss: 227Mb L: 7/11 MS: 1 ChangeBit- 00:09:24.148 #38 NEW cov: 11870 ft: 13330 corp: 10/71b lim: 35 exec/s: 38 rss: 228Mb L: 7/11 MS: 1 ChangeByte- 00:09:24.148 [2024-04-18 11:45:14.539257] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.148 [2024-04-18 11:45:14.539295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.148 [2024-04-18 11:45:14.539386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000022c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.148 [2024-04-18 11:45:14.539406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.148 [2024-04-18 11:45:14.599179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.148 [2024-04-18 11:45:14.599214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.148 [2024-04-18 11:45:14.599309] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000022c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.148 [2024-04-18 11:45:14.599327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.148 #40 NEW cov: 11870 ft: 13779 corp: 11/89b lim: 35 exec/s: 40 rss: 229Mb L: 18/18 MS: 1 CrossOver- 00:09:24.407 #42 NEW cov: 11870 ft: 13860 corp: 12/102b lim: 35 exec/s: 42 rss: 231Mb L: 13/18 MS: 1 CrossOver- 00:09:24.407 [2024-04-18 11:45:14.779386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.407 [2024-04-18 11:45:14.779427] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.407 [2024-04-18 11:45:14.779537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.407 [2024-04-18 11:45:14.779556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.407 [2024-04-18 11:45:14.779644] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.407 [2024-04-18 11:45:14.779662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.407 [2024-04-18 11:45:14.779757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.407 [2024-04-18 11:45:14.779777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.408 [2024-04-18 11:45:14.829686] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.829721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:09:24.408 [2024-04-18 11:45:14.829825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.829845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.408 [2024-04-18 11:45:14.829937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.829954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:09:24.408 [2024-04-18 11:45:14.830044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007f6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.830063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:09:24.408 #44 NEW cov: 11870 ft: 14506 corp: 13/136b lim: 35 exec/s: 44 rss: 232Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:09:24.408 [2024-04-18 11:45:14.891718] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006cd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.891755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.408 [2024-04-18 11:45:14.942102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006cd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:24.408 [2024-04-18 11:45:14.942136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:09:24.667 #51 NEW cov: 11870 ft: 14576 corp: 14/154b lim: 35 exec/s: 51 rss: 233Mb L: 18/34 MS: 1 CrossOver- 00:09:24.667 #53 NEW cov: 11870 ft: 14665 corp: 15/161b lim: 35 exec/s: 53 
rss: 235Mb L: 7/34 MS: 1 ChangeByte- 00:09:24.667 #55 NEW cov: 11870 ft: 14670 corp: 16/169b lim: 35 exec/s: 27 rss: 235Mb L: 8/34 MS: 1 InsertByte- 00:09:24.667 #55 DONE cov: 11870 ft: 14670 corp: 16/169b lim: 35 exec/s: 27 rss: 236Mb 00:09:24.667 Done 55 runs in 2 second(s) 00:09:25.235 11:45:15 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:09:25.235 11:45:15 -- ../common.sh@72 -- # (( i++ )) 00:09:25.235 11:45:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:25.235 11:45:15 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:09:25.235 11:45:15 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:09:25.235 11:45:15 -- nvmf/run.sh@24 -- # local timen=1 00:09:25.235 11:45:15 -- nvmf/run.sh@25 -- # local core=0x1 00:09:25.235 11:45:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:25.235 11:45:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:09:25.235 11:45:15 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:25.235 11:45:15 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:25.235 11:45:15 -- nvmf/run.sh@34 -- # printf %02d 16 00:09:25.235 11:45:15 -- nvmf/run.sh@34 -- # port=4416 00:09:25.235 11:45:15 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:25.235 11:45:15 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:09:25.235 11:45:15 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:25.235 11:45:15 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:25.235 11:45:15 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:25.235 11:45:15 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:09:25.235 [2024-04-18 11:45:15.690658] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:25.235 [2024-04-18 11:45:15.690767] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid372988 ] 00:09:25.235 EAL: No free 2048 kB hugepages reported on node 1 00:09:25.495 [2024-04-18 11:45:15.977168] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.754 [2024-04-18 11:45:16.130772] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.013 [2024-04-18 11:45:16.376930] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:26.013 [2024-04-18 11:45:16.393154] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:09:26.013 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:26.013 INFO: Seed: 2076400178 00:09:26.013 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:26.013 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:26.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:09:26.013 INFO: A corpus is not provided, starting from an empty corpus 00:09:26.013 #2 INITED exec/s: 0 rss: 199Mb 00:09:26.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:26.013 This may also happen if the target rejected all inputs we tried so far 00:09:26.013 [2024-04-18 11:45:16.448820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.013 [2024-04-18 11:45:16.448858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.273 NEW_FUNC[1/670]: 0x5626f0 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:09:26.273 NEW_FUNC[2/670]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:26.273 [2024-04-18 11:45:16.790984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.273 [2024-04-18 11:45:16.791042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.273 #18 NEW cov: 11801 ft: 11795 corp: 2/32b lim: 105 exec/s: 0 rss: 216Mb L: 31/31 MS: 5 InsertByte-EraseBytes-CrossOver-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:26.533 [2024-04-18 11:45:16.839171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.839212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.533 NEW_FUNC[1/1]: 0x1da0b70 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:09:26.533 [2024-04-18 11:45:16.889346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.889383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.533 #20 NEW cov: 11846 ft: 12332 corp: 3/63b lim: 105 exec/s: 0 rss: 219Mb L: 31/31 MS: 1 CMP- DE: "nvme"- 00:09:26.533 [2024-04-18 11:45:16.930636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584689304 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.930673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.533 [2024-04-18 11:45:16.930735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071488182527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.930755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.533 [2024-04-18 11:45:16.970681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584689304 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.970717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.533 [2024-04-18 11:45:16.970788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071488182527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:16.970808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.533 #24 NEW cov: 11858 ft: 13227 corp: 4/114b lim: 105 exec/s: 0 rss: 220Mb L: 51/51 MS: 3 EraseBytes-ChangeByte-CrossOver- 00:09:26.533 [2024-04-18 11:45:17.011975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:17.012010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.533 [2024-04-18 11:45:17.062073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.533 [2024-04-18 11:45:17.062108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 #26 NEW cov: 11944 ft: 13501 corp: 5/145b lim: 105 exec/s: 0 rss: 221Mb L: 31/51 MS: 1 ChangeBinInt- 00:09:26.793 [2024-04-18 11:45:17.122308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.122344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:26.793 [2024-04-18 11:45:17.162397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.162438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 #28 NEW cov: 11961 ft: 13648 corp: 6/176b lim: 105 exec/s: 0 rss: 223Mb L: 31/51 MS: 1 ShuffleBytes- 00:09:26.793 [2024-04-18 11:45:17.203884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584689304 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.203920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 [2024-04-18 11:45:17.203991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071488182527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.204010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:09:26.793 [2024-04-18 11:45:17.253940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584689304 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.253975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 [2024-04-18 11:45:17.254042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744071488182527 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.254063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.793 #30 NEW cov: 11961 ft: 13709 corp: 7/227b lim: 105 exec/s: 0 rss: 225Mb L: 51/51 MS: 1 ChangeBinInt- 00:09:26.793 [2024-04-18 11:45:17.296047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:27298656793835008 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.296081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:26.793 [2024-04-18 11:45:17.296140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.296159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:26.793 [2024-04-18 11:45:17.296216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.296234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:26.793 [2024-04-18 11:45:17.296291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:26.793 [2024-04-18 11:45:17.296309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.346184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:27298656793835008 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.346218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.346277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.346296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.346356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.346374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.346435] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18157383382357244923 len:64508 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.346453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.052 #34 NEW cov: 11961 ft: 14390 corp: 8/315b lim: 105 exec/s: 0 rss: 226Mb L: 88/88 MS: 3 CMP-ShuffleBytes-InsertRepeatedBytes- DE: "\325\266\000\000``\000\000"- 00:09:27.052 [2024-04-18 11:45:17.406529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446630379895455743 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.406565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.446595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446630379895455743 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.446629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.052 #38 NEW cov: 11961 ft: 14453 corp: 9/338b lim: 105 exec/s: 38 rss: 227Mb L: 23/88 MS: 3 CrossOver-CrossOver-CMP- DE: "\015\000\000\000"- 00:09:27.052 [2024-04-18 11:45:17.488035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.488070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.052 [2024-04-18 11:45:17.538223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.538257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.052 #40 NEW cov: 11961 ft: 14488 corp: 10/373b lim: 105 exec/s: 40 rss: 229Mb L: 35/88 MS: 1 PersAutoDict- DE: "nvme"- 00:09:27.052 [2024-04-18 11:45:17.584939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.052 [2024-04-18 11:45:17.584975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.311 [2024-04-18 11:45:17.635067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.311 [2024-04-18 11:45:17.635103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.311 #42 NEW cov: 11961 ft: 14522 corp: 11/408b lim: 105 exec/s: 42 rss: 230Mb L: 35/88 MS: 1 CMP- DE: "acce"- 00:09:27.311 [2024-04-18 11:45:17.681314] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.311 [2024-04-18 11:45:17.681350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.681434] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.681454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.681506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.681524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.731449] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.731482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.731537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.731556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.731610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.731628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.312 #44 NEW cov: 11961 ft: 14813 corp: 12/486b lim: 105 exec/s: 44 rss: 231Mb L: 78/88 MS: 1 InsertRepeatedBytes- 00:09:27.312 [2024-04-18 11:45:17.773825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:936748724211578477 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.773859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.773903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.773921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.773974] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.773992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.774043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.774061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.813912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:936748724211578477 len:1 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.813946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.813984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.814003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.814054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.814071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.312 [2024-04-18 11:45:17.814126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.814144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.312 #50 NEW cov: 11961 ft: 14848 corp: 13/583b lim: 105 exec/s: 50 rss: 233Mb L: 97/97 MS: 5 ChangeByte-PersAutoDict-ChangeByte-PersAutoDict-InsertRepeatedBytes- DE: "nvme"-"\015\000\000\000"- 00:09:27.312 [2024-04-18 11:45:17.855863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.312 [2024-04-18 11:45:17.855898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 [2024-04-18 11:45:17.895961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:17.895994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 #57 NEW cov: 11961 ft: 14883 corp: 14/614b lim: 105 exec/s: 57 rss: 234Mb L: 31/97 MS: 1 ChangeByte- 00:09:27.571 [2024-04-18 11:45:17.943584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:17.943619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 [2024-04-18 11:45:17.983657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:17.983692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 #59 NEW cov: 11961 ft: 14962 corp: 15/645b lim: 105 exec/s: 59 rss: 235Mb L: 31/97 MS: 1 ChangeByte- 00:09:27.571 [2024-04-18 11:45:18.025613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:18.025650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 
[2024-04-18 11:45:18.075712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:18.075746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.571 #61 NEW cov: 11961 ft: 14981 corp: 16/677b lim: 105 exec/s: 61 rss: 237Mb L: 32/97 MS: 1 InsertByte- 00:09:27.571 [2024-04-18 11:45:18.117775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446630379895455743 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.571 [2024-04-18 11:45:18.117814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.167859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446630379895455743 len:39065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.167895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 #63 NEW cov: 11961 ft: 15013 corp: 17/701b lim: 105 exec/s: 63 rss: 238Mb L: 24/97 MS: 1 InsertByte- 00:09:27.831 [2024-04-18 11:45:18.226343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.226379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.226463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.226483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.226537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.226566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.226620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.226637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.276422] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.276457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.276512] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.276531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.276597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.276614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.276668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.276686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:27.831 #65 NEW cov: 11961 ft: 15055 corp: 18/799b lim: 105 exec/s: 65 rss: 239Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:09:27.831 [2024-04-18 11:45:18.318951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.318987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 [2024-04-18 11:45:18.359095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069590456472 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:27.831 [2024-04-18 11:45:18.359129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:27.831 #67 NEW cov: 11961 ft: 15082 corp: 19/830b lim: 105 exec/s: 67 rss: 240Mb L: 31/98 MS: 1 PersAutoDict- DE: "\325\266\000\000``\000\000"- 00:09:28.091 [2024-04-18 11:45:18.401238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.091 [2024-04-18 11:45:18.401276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.091 [2024-04-18 11:45:18.451344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446465897443596440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:09:28.091 [2024-04-18 11:45:18.451378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:28.091 #69 NEW cov: 11961 ft: 15102 corp: 20/862b lim: 105 exec/s: 34 rss: 242Mb L: 32/98 MS: 1 ShuffleBytes- 00:09:28.091 #69 DONE cov: 11961 ft: 15102 corp: 20/862b lim: 105 exec/s: 34 rss: 242Mb 00:09:28.091 ###### Recommended dictionary. ###### 00:09:28.091 "nvme" # Uses: 2 00:09:28.091 "\325\266\000\000``\000\000" # Uses: 1 00:09:28.091 "\015\000\000\000" # Uses: 1 00:09:28.091 "acce" # Uses: 0 00:09:28.091 ###### End of recommended dictionary. 
###### 00:09:28.091 Done 69 runs in 2 second(s) 00:09:28.659 11:45:18 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:09:28.659 11:45:18 -- ../common.sh@72 -- # (( i++ )) 00:09:28.659 11:45:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:28.659 11:45:18 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:09:28.659 11:45:18 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:09:28.659 11:45:18 -- nvmf/run.sh@24 -- # local timen=1 00:09:28.659 11:45:18 -- nvmf/run.sh@25 -- # local core=0x1 00:09:28.659 11:45:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:28.659 11:45:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:09:28.659 11:45:18 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:28.659 11:45:18 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:28.659 11:45:18 -- nvmf/run.sh@34 -- # printf %02d 17 00:09:28.659 11:45:18 -- nvmf/run.sh@34 -- # port=4417 00:09:28.659 11:45:18 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:28.659 11:45:18 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:09:28.659 11:45:18 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:28.659 11:45:18 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:28.659 11:45:18 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:28.659 11:45:18 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:09:28.659 [2024-04-18 11:45:18.967429] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:28.659 [2024-04-18 11:45:18.967525] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid373477 ] 00:09:28.659 EAL: No free 2048 kB hugepages reported on node 1 00:09:28.918 [2024-04-18 11:45:19.235480] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.919 [2024-04-18 11:45:19.386725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.178 [2024-04-18 11:45:19.632881] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:29.178 [2024-04-18 11:45:19.649110] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:09:29.178 INFO: Running with entropic power schedule (0xFF, 100). 
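For reference, the start_llvm_fuzz 17 step traced above can be approximated outside Jenkins with a short wrapper. The sketch below is assembled only from the commands visible in this trace; the output redirections for the sed and echo steps, the export of LSAN_OPTIONS (the trace declares it as a shell local), and the WORKSPACE variable are assumptions filled in for illustration, not part of the recorded log.

#!/usr/bin/env bash
# Minimal sketch of the traced start_llvm_fuzz 17 step (fuzzer 17, 1 second,
# core mask 0x1). Redirection targets and WORKSPACE are assumptions.
WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest  # hypothetical checkout path
nvmf_cfg=/tmp/fuzz_json_17.conf
suppress_file=/var/tmp/suppress_nvmf_fuzz
corpus_dir="$WORKSPACE/spdk/../corpus/llvm_nvmf_17"

mkdir -p "$corpus_dir"

# Rewrite the fuzzer's NVMe-oF JSON config to listen on port 4417 instead of
# the default 4420 (the trace shows this sed; writing to $nvmf_cfg is assumed).
sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' \
    "$WORKSPACE/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

# Suppress the two known allocations for LeakSanitizer (the trace shows the
# echo commands; the redirections into $suppress_file are assumed).
echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
echo leak:nvmf_ctrlr_create          >> "$suppress_file"

# Exported here (assumption) so the fuzzer process inherits it.
export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"

trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
"$WORKSPACE/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m 0x1 -s 512 \
    -P "$WORKSPACE/spdk/../output/llvm/" \
    -F "$trid" \
    -c "$nvmf_cfg" \
    -t 1 \
    -D "$corpus_dir" \
    -Z 17

The per-run config and suppression files are transient: the rm -rf at the top of this block shows run.sh deleting the previous run's /tmp/fuzz_json_16.conf and /var/tmp/suppress_nvmf_fuzz before setting up run 17.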
00:09:29.178 INFO: Seed: 1037447918 00:09:29.178 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:29.178 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:29.178 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:09:29.178 INFO: A corpus is not provided, starting from an empty corpus 00:09:29.178 #2 INITED exec/s: 0 rss: 199Mb 00:09:29.178 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:29.178 This may also happen if the target rejected all inputs we tried so far 00:09:29.178 [2024-04-18 11:45:19.705142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.178 [2024-04-18 11:45:19.705183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.178 [2024-04-18 11:45:19.705231] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.178 [2024-04-18 11:45:19.705255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.178 [2024-04-18 11:45:19.705307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.178 [2024-04-18 11:45:19.705324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.697 NEW_FUNC[1/672]: 0x5661e0 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:09:29.697 NEW_FUNC[2/672]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:29.697 [2024-04-18 11:45:20.077734] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.697 [2024-04-18 11:45:20.077804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.697 [2024-04-18 11:45:20.077909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.697 [2024-04-18 11:45:20.077937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.697 [2024-04-18 11:45:20.078039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.697 [2024-04-18 11:45:20.078065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.697 #16 NEW cov: 11843 ft: 11816 corp: 2/91b lim: 120 exec/s: 0 rss: 217Mb L: 90/90 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:09:29.697 [2024-04-18 11:45:20.158074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.697 
[2024-04-18 11:45:20.158123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.697 [2024-04-18 11:45:20.218214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.697 [2024-04-18 11:45:20.218252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.697 #19 NEW cov: 11867 ft: 13142 corp: 3/128b lim: 120 exec/s: 0 rss: 218Mb L: 37/90 MS: 2 CrossOver-InsertRepeatedBytes- 00:09:29.957 [2024-04-18 11:45:20.280650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.280686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.280774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.280794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.280884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.280906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.340796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.340834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.340906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.340927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.341024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.341045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.957 #21 NEW cov: 11879 ft: 13319 corp: 4/222b lim: 120 exec/s: 0 rss: 220Mb L: 94/94 MS: 1 CMP- DE: "\377\377\3773"- 00:09:29.957 [2024-04-18 11:45:20.415281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.415322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.415420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.415447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.415539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.415563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.957 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:29.957 [2024-04-18 11:45:20.465382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.465420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.465493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.465516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:29.957 [2024-04-18 11:45:20.465570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:29.957 [2024-04-18 11:45:20.465589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:29.957 #23 NEW cov: 11982 ft: 13635 corp: 5/312b lim: 120 exec/s: 0 rss: 221Mb L: 90/94 MS: 1 ChangeByte- 00:09:30.216 [2024-04-18 11:45:20.527298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599068159 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.527339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.216 [2024-04-18 11:45:20.527394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.527425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.216 [2024-04-18 11:45:20.527518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.527545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.216 [2024-04-18 11:45:20.577392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599068159 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.577430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.216 [2024-04-18 11:45:20.577507] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.577529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.216 [2024-04-18 11:45:20.577600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.216 [2024-04-18 11:45:20.577619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.216 #25 NEW cov: 11982 ft: 13747 corp: 6/402b lim: 120 exec/s: 0 rss: 223Mb L: 90/94 MS: 1 ChangeBit- 00:09:30.216 [2024-04-18 11:45:20.648499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6727636073062596445 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.648543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.217 [2024-04-18 11:45:20.648640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:6727636073941130589 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.648664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.217 [2024-04-18 11:45:20.648803] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6727636073941130589 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.648831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.217 [2024-04-18 11:45:20.698772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:6727636073062596445 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.698806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.217 [2024-04-18 11:45:20.698899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:6727636073941130589 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.698920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.217 [2024-04-18 11:45:20.699010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6727636073941130589 len:23902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.699040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.217 #31 NEW cov: 11982 ft: 13812 corp: 7/489b lim: 120 exec/s: 31 rss: 224Mb L: 87/94 MS: 5 ChangeByte-CopyPart-ChangeByte-PersAutoDict-InsertRepeatedBytes- DE: "\377\377\3773"- 00:09:30.217 [2024-04-18 11:45:20.763521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.217 [2024-04-18 11:45:20.763565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.823471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.823508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.476 #33 NEW cov: 11982 ft: 13908 corp: 8/526b lim: 120 exec/s: 33 rss: 226Mb L: 37/94 MS: 1 CopyPart- 00:09:30.476 [2024-04-18 11:45:20.895903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.895939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.896017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.896038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.896122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.896142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.956163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.956197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.956268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.956291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:20.956352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:20.956375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.476 #35 NEW cov: 11982 ft: 13944 corp: 9/605b lim: 120 exec/s: 35 rss: 227Mb L: 79/94 MS: 1 EraseBytes- 00:09:30.476 [2024-04-18 11:45:21.021031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:21.021075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.476 [2024-04-18 11:45:21.021157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.476 [2024-04-18 11:45:21.021181] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.071383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.071422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.071522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.071543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.736 #37 NEW cov: 11982 ft: 14272 corp: 10/653b lim: 120 exec/s: 37 rss: 229Mb L: 48/94 MS: 1 EraseBytes- 00:09:30.736 [2024-04-18 11:45:21.143250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.143291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.143394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.143424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.143515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.143539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.203254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.203290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.203379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.203402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.203500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.203523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:30.736 #39 NEW cov: 11982 ft: 14411 corp: 11/732b lim: 120 exec/s: 39 rss: 230Mb L: 79/94 MS: 1 ChangeBit- 00:09:30.736 [2024-04-18 11:45:21.273488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.273525] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.736 [2024-04-18 11:45:21.273602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.736 [2024-04-18 11:45:21.273625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.995 [2024-04-18 11:45:21.323695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.995 [2024-04-18 11:45:21.323732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.995 [2024-04-18 11:45:21.323831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.995 [2024-04-18 11:45:21.323850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:30.995 #41 NEW cov: 11982 ft: 14446 corp: 12/790b lim: 120 exec/s: 41 rss: 231Mb L: 58/94 MS: 1 EraseBytes- 00:09:30.995 [2024-04-18 11:45:21.395431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.995 [2024-04-18 11:45:21.395472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.996 [2024-04-18 11:45:21.455672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.996 [2024-04-18 11:45:21.455709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.996 #43 NEW cov: 11982 ft: 14586 corp: 13/821b lim: 120 exec/s: 43 rss: 233Mb L: 31/94 MS: 1 CrossOver- 00:09:30.996 [2024-04-18 11:45:21.513548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.996 [2024-04-18 11:45:21.513581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:30.996 [2024-04-18 11:45:21.513683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5136152271503443783 len:18248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:30.996 [2024-04-18 11:45:21.513701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.255 [2024-04-18 11:45:21.573715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.255 [2024-04-18 11:45:21.573747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.255 [2024-04-18 11:45:21.573815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5136152271503443783 len:18248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:09:31.255 [2024-04-18 11:45:21.573836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:31.255 #45 NEW cov: 11982 ft: 14668 corp: 14/892b lim: 120 exec/s: 45 rss: 234Mb L: 71/94 MS: 1 InsertRepeatedBytes- 00:09:31.255 [2024-04-18 11:45:21.634838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.255 [2024-04-18 11:45:21.634882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.255 [2024-04-18 11:45:21.685386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14106333700309042115 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:31.255 [2024-04-18 11:45:21.685423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:31.255 #47 NEW cov: 11982 ft: 14676 corp: 15/929b lim: 120 exec/s: 23 rss: 235Mb L: 37/94 MS: 1 CMP- DE: "RG_ring_2_373477"- 00:09:31.255 #47 DONE cov: 11982 ft: 14676 corp: 15/929b lim: 120 exec/s: 23 rss: 235Mb 00:09:31.255 ###### Recommended dictionary. ###### 00:09:31.255 "\377\377\3773" # Uses: 1 00:09:31.255 "RG_ring_2_373477" # Uses: 0 00:09:31.255 ###### End of recommended dictionary. ###### 00:09:31.255 Done 47 runs in 2 second(s) 00:09:31.823 11:45:22 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:09:31.824 11:45:22 -- ../common.sh@72 -- # (( i++ )) 00:09:31.824 11:45:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:31.824 11:45:22 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:09:31.824 11:45:22 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:09:31.824 11:45:22 -- nvmf/run.sh@24 -- # local timen=1 00:09:31.824 11:45:22 -- nvmf/run.sh@25 -- # local core=0x1 00:09:31.824 11:45:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:31.824 11:45:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:09:31.824 11:45:22 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:31.824 11:45:22 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:31.824 11:45:22 -- nvmf/run.sh@34 -- # printf %02d 18 00:09:31.824 11:45:22 -- nvmf/run.sh@34 -- # port=4418 00:09:31.824 11:45:22 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:31.824 11:45:22 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:09:31.824 11:45:22 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:31.824 11:45:22 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:31.824 11:45:22 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:31.824 11:45:22 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:09:31.824 [2024-04-18 11:45:22.203858] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:31.824 [2024-04-18 11:45:22.203964] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid373859 ] 00:09:31.824 EAL: No free 2048 kB hugepages reported on node 1 00:09:32.083 [2024-04-18 11:45:22.479297] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:32.083 [2024-04-18 11:45:22.632483] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.342 [2024-04-18 11:45:22.877818] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:32.342 [2024-04-18 11:45:22.894056] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:09:32.603 INFO: Running with entropic power schedule (0xFF, 100). 00:09:32.603 INFO: Seed: 4283445095 00:09:32.603 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:32.603 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:32.603 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:09:32.603 INFO: A corpus is not provided, starting from an empty corpus 00:09:32.603 #2 INITED exec/s: 0 rss: 200Mb 00:09:32.603 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:32.603 This may also happen if the target rejected all inputs we tried so far 00:09:32.603 [2024-04-18 11:45:22.950078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:32.603 [2024-04-18 11:45:22.950115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.603 [2024-04-18 11:45:22.950155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:32.603 [2024-04-18 11:45:22.950172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.603 [2024-04-18 11:45:22.950221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:32.603 [2024-04-18 11:45:22.950238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.603 [2024-04-18 11:45:22.950283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:32.603 [2024-04-18 11:45:22.950302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.603 [2024-04-18 11:45:22.950349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:09:32.603 [2024-04-18 11:45:22.950364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:32.861 NEW_FUNC[1/670]: 0x56a310 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:09:32.861 NEW_FUNC[2/670]: 0x58d4c0 in TestOneInput 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:32.861 [2024-04-18 11:45:23.293080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:32.861 [2024-04-18 11:45:23.293131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.861 [2024-04-18 11:45:23.293207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:32.861 [2024-04-18 11:45:23.293225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.861 [2024-04-18 11:45:23.293279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:32.861 [2024-04-18 11:45:23.293295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.861 [2024-04-18 11:45:23.293344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:32.861 [2024-04-18 11:45:23.293360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.861 [2024-04-18 11:45:23.293411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:09:32.861 [2024-04-18 11:45:23.293438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:32.862 #17 NEW cov: 11786 ft: 11787 corp: 2/101b lim: 100 exec/s: 0 rss: 217Mb L: 100/100 MS: 4 ShuffleBytes-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:09:32.862 [2024-04-18 11:45:23.341847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:32.862 [2024-04-18 11:45:23.341885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:32.862 [2024-04-18 11:45:23.341962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:09:32.862 [2024-04-18 11:45:23.341980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:32.862 [2024-04-18 11:45:23.342035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:09:32.862 [2024-04-18 11:45:23.342051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:32.862 [2024-04-18 11:45:23.342116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:09:32.862 [2024-04-18 11:45:23.342133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:32.862 [2024-04-18 11:45:23.342187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:09:32.862 [2024-04-18 11:45:23.342204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:09:32.862 [2024-04-18 11:45:23.391898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:09:32.862 
[2024-04-18 11:45:23.391932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:09:32.862 [2024-04-18 11:45:23.391982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:09:32.862 [2024-04-18 11:45:23.392000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:09:32.862 [2024-04-18 11:45:23.392055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:09:32.862 [2024-04-18 11:45:23.392072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:09:32.862 [2024-04-18 11:45:23.392125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:09:32.862 [2024-04-18 11:45:23.392141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:09:32.862 [2024-04-18 11:45:23.392209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0
00:09:32.862 [2024-04-18 11:45:23.392230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:09:32.862 #19 NEW cov: 11810 ft: 12287 corp: 3/201b lim: 100 exec/s: 0 rss: 219Mb L: 100/100 MS: 1 ShuffleBytes-
00:09:33.122 #21 NEW cov: 11822 ft: 12607 corp: 4/301b lim: 100 exec/s: 0 rss: 221Mb L: 100/100 MS: 1 ChangeBinInt-
00:09:33.122 #23 NEW cov: 11908 ft: 12837 corp: 5/401b lim: 100 exec/s: 0 rss: 222Mb L: 100/100 MS: 1 ShuffleBytes-
00:09:33.122 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:33.382 #25 NEW cov: 11925 ft: 12988 corp: 6/501b lim: 100 exec/s: 0 rss: 223Mb L: 100/100 MS: 1 ChangeBinInt-
00:09:33.382 #27 NEW cov: 11925 ft: 13154 corp: 7/601b lim: 100 exec/s: 0 rss: 225Mb L: 100/100 MS: 1 CrossOver-
00:09:33.383 #29 NEW cov: 11925 ft: 13205 corp: 8/701b lim: 100 exec/s: 0 rss: 226Mb L: 100/100 MS: 1 CopyPart-
00:09:33.643 #31 NEW cov: 11925 ft: 13232 corp: 9/801b lim: 100 exec/s: 31 rss: 227Mb L: 100/100 MS: 1 ChangeByte-
00:09:33.643 #33 NEW cov: 11925 ft: 13313 corp: 10/901b lim: 100 exec/s: 33 rss: 229Mb L: 100/100 MS: 1 ChangeBit-
00:09:33.644 #35 NEW cov: 11925 ft: 13348 corp: 11/1001b lim: 100 exec/s: 35 rss: 231Mb L: 100/100 MS: 1 ChangeBinInt-
00:09:33.904 #37 NEW cov: 11925 ft: 13386 corp: 12/1101b lim: 100 exec/s: 37 rss: 232Mb L: 100/100 MS: 1 CopyPart-
00:09:33.904 #39 NEW cov: 11925 ft: 13423 corp: 13/1201b lim: 100 exec/s: 39 rss: 233Mb L: 100/100 MS: 1 CMP- DE: "\007\000\000\000"-
00:09:34.164 #41 NEW cov: 11925 ft: 13494 corp: 14/1301b lim: 100 exec/s: 41 rss: 235Mb L: 100/100 MS: 1 CMP- DE: "nvmf"-
00:09:34.165 #43 NEW cov: 11925 ft: 13506 corp: 15/1401b lim: 100 exec/s: 43 rss: 235Mb L: 100/100 MS: 1 ChangeBinInt-
00:09:34.165 #45 NEW cov: 11925 ft: 13883 corp: 16/1462b lim: 100 exec/s: 45 rss: 236Mb L: 61/100 MS: 1 EraseBytes-
00:09:34.425 #47 NEW cov: 11925 ft: 13898 corp: 17/1562b lim: 100 exec/s: 47 rss: 237Mb L: 100/100 MS: 1 PersAutoDict- DE: "\007\000\000\000"-
00:09:34.425 #49 NEW cov: 11925 ft: 13966 corp: 18/1662b lim: 100 exec/s: 49 rss: 240Mb L: 100/100 MS: 1 ChangeByte-
00:09:34.426 #51 NEW cov: 11925 ft: 13978 corp: 19/1762b lim: 100 exec/s: 51 rss: 241Mb L: 100/100 MS: 1 PersAutoDict- DE: "nvmf"-
00:09:34.686 #53 NEW cov: 11925 ft: 14032 corp: 20/1862b lim: 100 exec/s: 26 rss: 242Mb L: 100/100 MS: 1 ChangeBit-
00:09:34.686 #53 DONE cov: 11925 ft: 14032 corp: 20/1862b lim: 100 exec/s: 26 rss: 242Mb
00:09:34.686 ###### Recommended dictionary. ######
00:09:34.686 "\007\000\000\000" # Uses: 1
00:09:34.686 "nvmf" # Uses: 1
00:09:34.686 ###### End of recommended dictionary. ######
00:09:34.686 Done 53 runs in 2 second(s)
00:09:34.976 11:45:25 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz
11:45:25 -- ../common.sh@72 -- # (( i++ ))
11:45:25 -- ../common.sh@72 -- # (( i < fuzz_num ))
11:45:25 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
11:45:25 -- nvmf/run.sh@23 -- # local fuzzer_type=19
11:45:25 -- nvmf/run.sh@24 -- # local timen=1
11:45:25 -- nvmf/run.sh@25 -- # local core=0x1
11:45:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
11:45:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
11:45:25 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
11:45:25 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
11:45:25 -- nvmf/run.sh@34 -- # printf %02d 19
11:45:25 -- nvmf/run.sh@34 -- # port=4419
11:45:25 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
11:45:25 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
11:45:25 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
11:45:25 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
11:45:25 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
11:45:25 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19
[2024-04-18 11:45:25.553763] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
[2024-04-18 11:45:25.553855] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid374399 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-04-18 11:45:25.829166] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-04-18 11:45:25.982339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-04-18 11:45:26.226599] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-04-18 11:45:26.242806] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
INFO: Running with entropic power schedule (0xFF, 100).
00:09:35.754 INFO: Seed: 3336437356
INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a),
00:09:35.754 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00),
00:09:35.754 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:09:35.754 INFO: A corpus is not provided, starting from an empty corpus
00:09:35.754 #2 INITED exec/s: 0 rss: 200Mb
00:09:35.754 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:09:35.754 This may also happen if the target rejected all inputs we tried so far
00:09:35.754 [2024-04-18 11:45:26.298623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968
00:09:35.754 [2024-04-18 11:45:26.298664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:09:35.754 [2024-04-18 11:45:26.298715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968
00:09:35.754 [2024-04-18 11:45:26.298734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:09:36.273 NEW_FUNC[1/669]: 0x56d9c0 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582
00:09:36.273 NEW_FUNC[2/669]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:09:36.273 [2024-04-18 11:45:26.659519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968
00:09:36.273 [2024-04-18 11:45:26.659578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:09:36.273 [2024-04-18 11:45:26.659634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968
00:09:36.273 [2024-04-18 11:45:26.659654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:09:36.273 #4 NEW cov: 11758 ft: 11731 corp: 2/24b lim: 50 exec/s: 0 rss: 217Mb L: 23/23 MS: 1 InsertRepeatedBytes-
00:09:36.273 [2024-04-18 11:45:26.712141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545907383869215 len:40864
00:09:36.273 [2024-04-18 11:45:26.712188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:09:36.273 [2024-04-18 11:45:26.712242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11502087481254191007 len:40864
00:09:36.273 [2024-04-18 11:45:26.712261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:09:36.273 [2024-04-18 11:45:26.712306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:11502087481254191007 len:40864
[2024-04-18 11:45:26.712324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:09:36.273 NEW_FUNC[1/1]: 0x1ac5000 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1128
00:09:36.273 #10 NEW cov: 11788 ft: 12492 corp: 3/54b lim: 50 exec/s: 0 rss: 218Mb L: 30/30 MS: 5 CopyPart-CrossOver-CopyPart-EraseBytes-InsertRepeatedBytes-
00:09:36.533 #12 NEW cov: 11800 ft: 12950 corp: 4/99b lim: 50 exec/s: 0 rss: 220Mb L: 45/45 MS: 1 InsertRepeatedBytes-
00:09:36.533 #14 NEW cov: 11886 ft: 13354 corp: 5/129b lim: 50 exec/s: 0 rss: 222Mb L: 30/45 MS: 1 ShuffleBytes-
00:09:36.534 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:09:36.534 #16 NEW cov: 11903 ft: 13510 corp: 6/167b lim: 50 exec/s: 0 rss: 223Mb L: 38/45 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"-
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:36.794 [2024-04-18 11:45:27.129562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.794 #20 NEW cov: 11903 ft: 13682 corp: 7/202b lim: 50 exec/s: 0 rss: 225Mb L: 35/45 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:09:36.794 [2024-04-18 11:45:27.174384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:36.794 [2024-04-18 11:45:27.174422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.174476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545499714297631 len:7968 00:09:36.794 [2024-04-18 11:45:27.174495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.214483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:36.794 [2024-04-18 11:45:27.214517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.214596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545499714297631 len:7968 00:09:36.794 [2024-04-18 11:45:27.214616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.794 #22 NEW cov: 11903 ft: 13772 corp: 8/225b lim: 50 exec/s: 0 rss: 226Mb L: 23/45 MS: 1 ChangeByte- 00:09:36.794 [2024-04-18 11:45:27.263866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:36.794 [2024-04-18 11:45:27.263901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.263945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:36.794 [2024-04-18 11:45:27.263963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.264014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4620445961231998751 len:7968 00:09:36.794 [2024-04-18 11:45:27.264032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.314010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:36.794 [2024-04-18 11:45:27.314044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.314080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:36.794 [2024-04-18 
11:45:27.314098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:36.794 [2024-04-18 11:45:27.314151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4620445961231998751 len:7968 00:09:36.794 [2024-04-18 11:45:27.314181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:36.794 #24 NEW cov: 11903 ft: 13866 corp: 9/255b lim: 50 exec/s: 24 rss: 227Mb L: 30/45 MS: 1 InsertRepeatedBytes- 00:09:37.053 [2024-04-18 11:45:27.361002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545907383869215 len:40864 00:09:37.054 [2024-04-18 11:45:27.361039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.361089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11501911976005574559 len:11 00:09:37.054 [2024-04-18 11:45:27.361107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.361159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2242511139975929631 len:1 00:09:37.054 [2024-04-18 11:45:27.361178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.411118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545907383869215 len:40864 00:09:37.054 [2024-04-18 11:45:27.411151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.411203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11501911976005574559 len:11 00:09:37.054 [2024-04-18 11:45:27.411222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.411277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2242511139975929631 len:1 00:09:37.054 [2024-04-18 11:45:27.411295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.054 #26 NEW cov: 11903 ft: 13878 corp: 10/292b lim: 50 exec/s: 26 rss: 229Mb L: 37/45 MS: 1 CrossOver- 00:09:37.054 [2024-04-18 11:45:27.456556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.054 [2024-04-18 11:45:27.456592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.456663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545500169051935 len:7968 00:09:37.054 [2024-04-18 11:45:27.456683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 
11:45:27.506733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.054 [2024-04-18 11:45:27.506767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.506816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545500169051935 len:7968 00:09:37.054 [2024-04-18 11:45:27.506834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.054 #28 NEW cov: 11903 ft: 13894 corp: 11/315b lim: 50 exec/s: 28 rss: 230Mb L: 23/45 MS: 1 EraseBytes- 00:09:37.054 [2024-04-18 11:45:27.566847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628063775 len:7968 00:09:37.054 [2024-04-18 11:45:27.566883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.054 [2024-04-18 11:45:27.566942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545499714297631 len:7968 00:09:37.054 [2024-04-18 11:45:27.566962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.606956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628063775 len:7968 00:09:37.314 [2024-04-18 11:45:27.606991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.607060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545499714297631 len:7968 00:09:37.314 [2024-04-18 11:45:27.607078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 #30 NEW cov: 11903 ft: 13911 corp: 12/338b lim: 50 exec/s: 30 rss: 231Mb L: 23/45 MS: 1 CopyPart- 00:09:37.314 [2024-04-18 11:45:27.654392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545907383869215 len:40864 00:09:37.314 [2024-04-18 11:45:27.654433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.654505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11501911976005574559 len:11 00:09:37.314 [2024-04-18 11:45:27.654524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.654578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2242511139975929631 len:1 00:09:37.314 [2024-04-18 11:45:27.654597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.704482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545907383869215 len:40864 00:09:37.314 [2024-04-18 11:45:27.704515] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.704585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11501911976005574559 len:11 00:09:37.314 [2024-04-18 11:45:27.704605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.704656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2242511139975929631 len:1 00:09:37.314 [2024-04-18 11:45:27.704674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.314 #32 NEW cov: 11903 ft: 13999 corp: 13/375b lim: 50 exec/s: 32 rss: 232Mb L: 37/45 MS: 1 ShuffleBytes- 00:09:37.314 [2024-04-18 11:45:27.764076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055354 len:7968 00:09:37.314 [2024-04-18 11:45:27.764114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.764186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 00:09:37.314 [2024-04-18 11:45:27.764205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.804207] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055354 len:7968 00:09:37.314 [2024-04-18 11:45:27.804241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.804320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 00:09:37.314 [2024-04-18 11:45:27.804340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 #34 NEW cov: 11903 ft: 14011 corp: 14/398b lim: 50 exec/s: 34 rss: 234Mb L: 23/45 MS: 1 ChangeByte- 00:09:37.314 [2024-04-18 11:45:27.849818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.314 [2024-04-18 11:45:27.849852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.849888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:37.314 [2024-04-18 11:45:27.849907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.314 [2024-04-18 11:45:27.849960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8223325663133394297 len:7968 00:09:37.314 [2024-04-18 11:45:27.849978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.889973] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.574 [2024-04-18 11:45:27.890009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.890078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:37.574 [2024-04-18 11:45:27.890098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.890152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8223325663133394297 len:7968 00:09:37.574 [2024-04-18 11:45:27.890170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 #36 NEW cov: 11903 ft: 14073 corp: 15/428b lim: 50 exec/s: 36 rss: 235Mb L: 30/45 MS: 1 CMP- DE: "keyr"- 00:09:37.574 [2024-04-18 11:45:27.937426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:09:37.574 [2024-04-18 11:45:27.937463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.937533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:27.937552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.937616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:27.937634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.987472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:09:37.574 [2024-04-18 11:45:27.987508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.987566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:27.987585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:27.987647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:27.987666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 #38 NEW cov: 11903 ft: 14090 corp: 16/464b lim: 50 exec/s: 38 rss: 236Mb L: 36/45 MS: 1 InsertByte- 00:09:37.574 [2024-04-18 11:45:28.046758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:09:37.574 [2024-04-18 
11:45:28.046794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:28.046856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18302628885633695743 len:65536 00:09:37.574 [2024-04-18 11:45:28.046874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:28.046931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:28.046950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:28.086862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 00:09:37.574 [2024-04-18 11:45:28.086895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:28.086930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18302628885633695743 len:65536 00:09:37.574 [2024-04-18 11:45:28.086949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.574 [2024-04-18 11:45:28.087000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:09:37.574 [2024-04-18 11:45:28.087018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.574 #40 NEW cov: 11903 ft: 14099 corp: 17/499b lim: 50 exec/s: 40 rss: 238Mb L: 35/45 MS: 1 ChangeBit- 00:09:37.834 [2024-04-18 11:45:28.132744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.834 [2024-04-18 11:45:28.132782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.132860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545500169051935 len:7968 00:09:37.834 [2024-04-18 11:45:28.132879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.182819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545357628055327 len:7968 00:09:37.834 [2024-04-18 11:45:28.182853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.182908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2242545500169051935 len:7968 00:09:37.834 [2024-04-18 11:45:28.182926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.834 #42 NEW cov: 11903 ft: 14130 corp: 18/522b lim: 50 exec/s: 42 rss: 239Mb L: 23/45 MS: 1 ChangeByte- 00:09:37.834 [2024-04-18 
11:45:28.230430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545439232433951 len:7968 00:09:37.834 [2024-04-18 11:45:28.230469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.230530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:37.834 [2024-04-18 11:45:28.230548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.230599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8223325663133394297 len:7968 00:09:37.834 [2024-04-18 11:45:28.230617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.280614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2242545439232433951 len:7968 00:09:37.834 [2024-04-18 11:45:28.280648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.280705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195729908644330042 len:7968 00:09:37.834 [2024-04-18 11:45:28.280723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:37.834 [2024-04-18 11:45:28.280774] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8223325663133394297 len:7968 00:09:37.834 [2024-04-18 11:45:28.280791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:37.834 #44 NEW cov: 11903 ft: 14197 corp: 19/552b lim: 50 exec/s: 22 rss: 240Mb L: 30/45 MS: 1 ChangeByte- 00:09:37.834 #44 DONE cov: 11903 ft: 14197 corp: 19/552b lim: 50 exec/s: 22 rss: 240Mb 00:09:37.834 ###### Recommended dictionary. ###### 00:09:37.834 "\000\000\000\000\000\000\000\000" # Uses: 0 00:09:37.834 "keyr" # Uses: 0 00:09:37.834 ###### End of recommended dictionary. 
###### 00:09:37.834 Done 44 runs in 2 second(s) 00:09:38.403 11:45:28 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:09:38.403 11:45:28 -- ../common.sh@72 -- # (( i++ )) 00:09:38.403 11:45:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:38.403 11:45:28 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:09:38.403 11:45:28 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:09:38.403 11:45:28 -- nvmf/run.sh@24 -- # local timen=1 00:09:38.403 11:45:28 -- nvmf/run.sh@25 -- # local core=0x1 00:09:38.403 11:45:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:38.403 11:45:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:09:38.403 11:45:28 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:38.403 11:45:28 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:38.403 11:45:28 -- nvmf/run.sh@34 -- # printf %02d 20 00:09:38.403 11:45:28 -- nvmf/run.sh@34 -- # port=4420 00:09:38.403 11:45:28 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:38.403 11:45:28 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:09:38.403 11:45:28 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:38.403 11:45:28 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:38.403 11:45:28 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:38.403 11:45:28 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:09:38.403 [2024-04-18 11:45:28.791600] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:38.403 [2024-04-18 11:45:28.791697] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid374787 ] 00:09:38.403 EAL: No free 2048 kB hugepages reported on node 1 00:09:38.663 [2024-04-18 11:45:29.045197] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:38.663 [2024-04-18 11:45:29.199493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.022 [2024-04-18 11:45:29.448959] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:39.022 [2024-04-18 11:45:29.465194] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:09:39.022 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:39.022 INFO: Seed: 2264502986 00:09:39.022 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:39.022 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:39.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:09:39.022 INFO: A corpus is not provided, starting from an empty corpus 00:09:39.022 #2 INITED exec/s: 0 rss: 199Mb 00:09:39.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:39.022 This may also happen if the target rejected all inputs we tried so far 00:09:39.022 [2024-04-18 11:45:29.542126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.022 [2024-04-18 11:45:29.542179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.539 NEW_FUNC[1/671]: 0x56f930 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:09:39.539 NEW_FUNC[2/671]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:39.539 [2024-04-18 11:45:29.903212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.539 [2024-04-18 11:45:29.903270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.539 #18 NEW cov: 11799 ft: 11785 corp: 2/20b lim: 90 exec/s: 0 rss: 216Mb L: 19/19 MS: 5 InsertByte-ChangeBit-CMP-CrossOver-CMP- DE: "\377\377\377\377\377\377\377\006"-"\000\000\000\000\000\000\000\000"- 00:09:39.539 [2024-04-18 11:45:29.981519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.539 [2024-04-18 11:45:29.981565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.539 NEW_FUNC[1/1]: 0x1b5a5f0 in nvme_tcp_ctrlr_connect_qpair_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:2382 00:09:39.539 [2024-04-18 11:45:30.041894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.539 [2024-04-18 11:45:30.041939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.539 #20 NEW cov: 11846 ft: 12369 corp: 3/39b lim: 90 exec/s: 0 rss: 219Mb L: 19/19 MS: 1 ChangeByte- 00:09:39.797 [2024-04-18 11:45:30.115252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.797 [2024-04-18 11:45:30.115297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.797 [2024-04-18 11:45:30.165480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.797 [2024-04-18 11:45:30.165519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.797 #27 NEW cov: 11858 ft: 12546 corp: 4/58b lim: 90 exec/s: 0 rss: 220Mb L: 19/19 MS: 1 ChangeByte- 00:09:39.797 [2024-04-18 11:45:30.235659] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.797 [2024-04-18 11:45:30.235705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.797 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:39.797 [2024-04-18 11:45:30.296119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:39.797 [2024-04-18 11:45:30.296156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:39.797 #32 NEW cov: 11961 ft: 12795 corp: 5/77b lim: 90 exec/s: 0 rss: 221Mb L: 19/19 MS: 4 EraseBytes-PersAutoDict-ChangeByte-CopyPart- DE: "\377\377\377\377\377\377\377\006"- 00:09:40.056 [2024-04-18 11:45:30.362869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.056 [2024-04-18 11:45:30.362902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.056 [2024-04-18 11:45:30.423203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.056 [2024-04-18 11:45:30.423236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.056 #37 NEW cov: 11961 ft: 13107 corp: 6/102b lim: 90 exec/s: 0 rss: 224Mb L: 25/25 MS: 4 EraseBytes-ShuffleBytes-CMP-CopyPart- DE: "\001\000\000\037"- 00:09:40.056 [2024-04-18 11:45:30.495965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.056 [2024-04-18 11:45:30.495998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.056 [2024-04-18 11:45:30.556360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.056 [2024-04-18 11:45:30.556394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.056 #39 NEW cov: 11961 ft: 13183 corp: 7/123b lim: 90 exec/s: 39 rss: 225Mb L: 21/25 MS: 1 EraseBytes- 00:09:40.314 [2024-04-18 11:45:30.628921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.314 [2024-04-18 11:45:30.628958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.315 [2024-04-18 11:45:30.678969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.315 [2024-04-18 11:45:30.679012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.315 #41 NEW cov: 11961 ft: 13282 corp: 8/142b lim: 90 exec/s: 41 rss: 226Mb L: 19/25 MS: 1 ChangeByte- 00:09:40.315 [2024-04-18 11:45:30.750151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.315 [2024-04-18 11:45:30.750191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.315 [2024-04-18 
11:45:30.750267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.315 [2024-04-18 11:45:30.750292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.315 [2024-04-18 11:45:30.810165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.315 [2024-04-18 11:45:30.810200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.315 [2024-04-18 11:45:30.810296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.315 [2024-04-18 11:45:30.810316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.315 #43 NEW cov: 11961 ft: 14114 corp: 9/195b lim: 90 exec/s: 43 rss: 228Mb L: 53/53 MS: 1 InsertRepeatedBytes- 00:09:40.573 [2024-04-18 11:45:30.881039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.573 [2024-04-18 11:45:30.881074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.573 [2024-04-18 11:45:30.941297] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.573 [2024-04-18 11:45:30.941333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.573 #45 NEW cov: 11961 ft: 14210 corp: 10/215b lim: 90 exec/s: 45 rss: 229Mb L: 20/53 MS: 1 InsertByte- 00:09:40.573 [2024-04-18 11:45:31.013382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.573 [2024-04-18 11:45:31.013421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.573 [2024-04-18 11:45:31.013524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.573 [2024-04-18 11:45:31.013546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.573 [2024-04-18 11:45:31.063633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.573 [2024-04-18 11:45:31.063665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.573 [2024-04-18 11:45:31.063743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.573 [2024-04-18 11:45:31.063764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.573 #49 NEW cov: 11961 ft: 14243 corp: 11/256b lim: 90 exec/s: 49 rss: 230Mb L: 41/53 MS: 3 EraseBytes-CMP-InsertRepeatedBytes- DE: "\001\000"- 00:09:40.831 [2024-04-18 11:45:31.134390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.831 [2024-04-18 11:45:31.134435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:09:40.831 [2024-04-18 11:45:31.184625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.831 [2024-04-18 11:45:31.184657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.831 #51 NEW cov: 11961 ft: 14275 corp: 12/276b lim: 90 exec/s: 51 rss: 232Mb L: 20/53 MS: 1 InsertByte- 00:09:40.831 [2024-04-18 11:45:31.255166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.831 [2024-04-18 11:45:31.255197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.831 [2024-04-18 11:45:31.255289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.831 [2024-04-18 11:45:31.255309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.831 [2024-04-18 11:45:31.315607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:40.831 [2024-04-18 11:45:31.315641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:40.831 [2024-04-18 11:45:31.315747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:40.831 [2024-04-18 11:45:31.315766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:40.831 #53 NEW cov: 11961 ft: 14347 corp: 13/324b lim: 90 exec/s: 53 rss: 233Mb L: 48/53 MS: 1 CrossOver- 00:09:41.091 [2024-04-18 11:45:31.387791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:41.091 [2024-04-18 11:45:31.387830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.387912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:41.091 [2024-04-18 11:45:31.387935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.448191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:41.091 [2024-04-18 11:45:31.448226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.448331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:41.091 [2024-04-18 11:45:31.448352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:41.091 #55 NEW cov: 11961 ft: 14444 corp: 14/369b lim: 90 exec/s: 55 rss: 234Mb L: 45/53 MS: 1 EraseBytes- 00:09:41.091 [2024-04-18 11:45:31.512249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:41.091 [2024-04-18 11:45:31.512286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.512358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:41.091 [2024-04-18 11:45:31.512379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.572760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:09:41.091 [2024-04-18 11:45:31.572796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:41.091 [2024-04-18 11:45:31.572880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:09:41.091 [2024-04-18 11:45:31.572899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:41.091 #62 NEW cov: 11961 ft: 14563 corp: 15/415b lim: 90 exec/s: 31 rss: 236Mb L: 46/53 MS: 1 InsertByte- 00:09:41.091 #62 DONE cov: 11961 ft: 14563 corp: 15/415b lim: 90 exec/s: 31 rss: 236Mb 00:09:41.091 ###### Recommended dictionary. ###### 00:09:41.091 "\377\377\377\377\377\377\377\006" # Uses: 1 00:09:41.091 "\000\000\000\000\000\000\000\000" # Uses: 0 00:09:41.091 "\001\000\000\037" # Uses: 0 00:09:41.091 "\001\000" # Uses: 0 00:09:41.091 ###### End of recommended dictionary. ###### 00:09:41.091 Done 62 runs in 2 second(s) 00:09:41.660 11:45:32 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:09:41.660 11:45:32 -- ../common.sh@72 -- # (( i++ )) 00:09:41.660 11:45:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:41.660 11:45:32 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:09:41.660 11:45:32 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:09:41.660 11:45:32 -- nvmf/run.sh@24 -- # local timen=1 00:09:41.660 11:45:32 -- nvmf/run.sh@25 -- # local core=0x1 00:09:41.660 11:45:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:41.660 11:45:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:09:41.660 11:45:32 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:41.660 11:45:32 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:41.660 11:45:32 -- nvmf/run.sh@34 -- # printf %02d 21 00:09:41.660 11:45:32 -- nvmf/run.sh@34 -- # port=4421 00:09:41.660 11:45:32 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:41.660 11:45:32 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:09:41.661 11:45:32 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:41.661 11:45:32 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:41.661 11:45:32 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:41.661 11:45:32 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:09:41.661 [2024-04-18 11:45:32.106744] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:41.661 [2024-04-18 11:45:32.106839] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid375275 ] 00:09:41.661 EAL: No free 2048 kB hugepages reported on node 1 00:09:41.921 [2024-04-18 11:45:32.376271] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.180 [2024-04-18 11:45:32.529877] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:42.439 [2024-04-18 11:45:32.775982] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:42.439 [2024-04-18 11:45:32.792205] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:09:42.439 INFO: Running with entropic power schedule (0xFF, 100). 00:09:42.439 INFO: Seed: 1296510995 00:09:42.439 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:42.440 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:42.440 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:09:42.440 INFO: A corpus is not provided, starting from an empty corpus 00:09:42.440 #2 INITED exec/s: 0 rss: 200Mb 00:09:42.440 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:42.440 This may also happen if the target rejected all inputs we tried so far 00:09:42.440 [2024-04-18 11:45:32.847966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.440 [2024-04-18 11:45:32.848002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.699 NEW_FUNC[1/672]: 0x573220 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:09:42.699 NEW_FUNC[2/672]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:42.699 [2024-04-18 11:45:33.190361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.699 [2024-04-18 11:45:33.190424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.699 #8 NEW cov: 11797 ft: 11770 corp: 2/11b lim: 50 exec/s: 0 rss: 216Mb L: 10/10 MS: 5 CrossOver-CopyPart-ChangeBit-CrossOver-CMP- DE: "\251l-\356\363\375\004\000"- 00:09:42.699 [2024-04-18 11:45:33.238811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.699 [2024-04-18 11:45:33.238853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.699 [2024-04-18 11:45:33.238922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:42.699 [2024-04-18 11:45:33.238945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:42.699 [2024-04-18 11:45:33.238999] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:42.699 [2024-04-18 11:45:33.239017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:42.958 [2024-04-18 11:45:33.278916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.958 [2024-04-18 11:45:33.278957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.958 [2024-04-18 11:45:33.279013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:42.958 [2024-04-18 11:45:33.279031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:42.958 [2024-04-18 11:45:33.279085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:42.958 [2024-04-18 11:45:33.279103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:42.958 #13 NEW cov: 11821 ft: 12840 corp: 3/50b lim: 50 exec/s: 0 rss: 219Mb L: 39/39 MS: 4 CMP-InsertByte-ChangeBinInt-InsertRepeatedBytes- DE: "\015\000\000\000"- 00:09:42.958 [2024-04-18 11:45:33.320082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.959 [2024-04-18 11:45:33.320116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.959 [2024-04-18 11:45:33.370221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.959 [2024-04-18 11:45:33.370254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.959 #15 NEW cov: 11833 ft: 13294 corp: 4/61b lim: 50 exec/s: 0 rss: 220Mb L: 11/39 MS: 1 CrossOver- 00:09:42.959 [2024-04-18 11:45:33.411325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.959 [2024-04-18 11:45:33.411361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.959 [2024-04-18 11:45:33.461436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:42.959 [2024-04-18 11:45:33.461470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:42.959 #18 NEW cov: 11919 ft: 13515 corp: 5/74b lim: 50 exec/s: 0 rss: 221Mb L: 13/39 MS: 2 EraseBytes-CMP- DE: "nvmf"- 00:09:43.218 [2024-04-18 11:45:33.519394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.519436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:43.218 [2024-04-18 11:45:33.559481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.559515] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 #20 NEW cov: 11936 ft: 13665 corp: 6/85b lim: 50 exec/s: 0 rss: 223Mb L: 11/39 MS: 1 CopyPart- 00:09:43.218 [2024-04-18 11:45:33.601020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.601055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 [2024-04-18 11:45:33.651103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.651137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 #22 NEW cov: 11936 ft: 13740 corp: 7/98b lim: 50 exec/s: 0 rss: 225Mb L: 13/39 MS: 1 CopyPart- 00:09:43.218 [2024-04-18 11:45:33.692656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.692706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 [2024-04-18 11:45:33.692767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.218 [2024-04-18 11:45:33.692786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.218 [2024-04-18 11:45:33.742758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.218 [2024-04-18 11:45:33.742791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.218 [2024-04-18 11:45:33.742849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.218 [2024-04-18 11:45:33.742866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.218 #24 NEW cov: 11936 ft: 14022 corp: 8/119b lim: 50 exec/s: 0 rss: 227Mb L: 21/39 MS: 1 CrossOver- 00:09:43.478 [2024-04-18 11:45:33.784585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.784618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 [2024-04-18 11:45:33.824660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.824694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 #26 NEW cov: 11936 ft: 14056 corp: 9/132b lim: 50 exec/s: 26 rss: 227Mb L: 13/39 MS: 1 ShuffleBytes- 00:09:43.478 [2024-04-18 11:45:33.866432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.866466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 [2024-04-18 11:45:33.906456] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.906488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 #28 NEW cov: 11936 ft: 14214 corp: 10/142b lim: 50 exec/s: 28 rss: 228Mb L: 10/39 MS: 1 ChangeBinInt- 00:09:43.478 [2024-04-18 11:45:33.948057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.948090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 [2024-04-18 11:45:33.998182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.478 [2024-04-18 11:45:33.998213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.478 #30 NEW cov: 11936 ft: 14237 corp: 11/160b lim: 50 exec/s: 30 rss: 230Mb L: 18/39 MS: 1 CopyPart- 00:09:43.738 [2024-04-18 11:45:34.043571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.043605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 [2024-04-18 11:45:34.083615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.083648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 #32 NEW cov: 11936 ft: 14283 corp: 12/174b lim: 50 exec/s: 32 rss: 231Mb L: 14/39 MS: 1 InsertByte- 00:09:43.738 [2024-04-18 11:45:34.125424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.125458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 [2024-04-18 11:45:34.175569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.175603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 #34 NEW cov: 11936 ft: 14295 corp: 13/192b lim: 50 exec/s: 34 rss: 233Mb L: 18/39 MS: 1 ChangeBinInt- 00:09:43.738 [2024-04-18 11:45:34.217588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.217621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 [2024-04-18 11:45:34.267728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.738 [2024-04-18 11:45:34.267759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.738 #36 NEW cov: 11936 ft: 14305 corp: 14/206b lim: 50 exec/s: 36 rss: 234Mb L: 14/39 MS: 1 InsertByte- 00:09:43.997 [2024-04-18 11:45:34.309923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.309958] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.360089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.360122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 #38 NEW cov: 11936 ft: 14325 corp: 15/221b lim: 50 exec/s: 38 rss: 236Mb L: 15/39 MS: 1 CrossOver- 00:09:43.997 [2024-04-18 11:45:34.402445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.402480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.402521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.997 [2024-04-18 11:45:34.402539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.402597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:43.997 [2024-04-18 11:45:34.402615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.452562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.452595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.452646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.997 [2024-04-18 11:45:34.452664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.452718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:43.997 [2024-04-18 11:45:34.452735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:43.997 #40 NEW cov: 11936 ft: 14371 corp: 16/260b lim: 50 exec/s: 40 rss: 237Mb L: 39/39 MS: 1 ChangeBinInt- 00:09:43.997 [2024-04-18 11:45:34.495403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.495444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.495505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.997 [2024-04-18 11:45:34.495527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.495590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:43.997 [2024-04-18 11:45:34.495609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.495661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:43.997 [2024-04-18 11:45:34.495678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.545552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:43.997 [2024-04-18 11:45:34.545586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.545626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:09:43.997 [2024-04-18 11:45:34.545644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.545697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:09:43.997 [2024-04-18 11:45:34.545715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:43.997 [2024-04-18 11:45:34.545774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:09:43.998 [2024-04-18 11:45:34.545792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:44.257 #42 NEW cov: 11936 ft: 14774 corp: 17/304b lim: 50 exec/s: 42 rss: 238Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:09:44.257 [2024-04-18 11:45:34.593761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.257 [2024-04-18 11:45:34.593797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.257 [2024-04-18 11:45:34.643854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.257 [2024-04-18 11:45:34.643889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.257 #44 NEW cov: 11936 ft: 14791 corp: 18/315b lim: 50 exec/s: 44 rss: 240Mb L: 11/44 MS: 1 InsertByte- 00:09:44.257 [2024-04-18 11:45:34.686352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.257 [2024-04-18 11:45:34.686387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.257 [2024-04-18 11:45:34.726408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.257 [2024-04-18 11:45:34.726448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.257 #46 NEW cov: 11936 ft: 14858 corp: 19/333b lim: 50 exec/s: 46 rss: 241Mb L: 18/44 MS: 1 CopyPart- 00:09:44.257 [2024-04-18 11:45:34.768859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.257 [2024-04-18 11:45:34.768894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.517 [2024-04-18 11:45:34.818995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:09:44.517 [2024-04-18 11:45:34.819029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:44.517 #48 NEW cov: 11936 ft: 14885 corp: 20/344b lim: 50 exec/s: 24 rss: 242Mb L: 11/44 MS: 1 ChangeByte- 00:09:44.517 #48 DONE cov: 11936 ft: 14885 corp: 20/344b lim: 50 exec/s: 24 rss: 242Mb 00:09:44.517 ###### Recommended dictionary. ###### 00:09:44.517 "\251l-\356\363\375\004\000" # Uses: 0 00:09:44.517 "\015\000\000\000" # Uses: 0 00:09:44.517 "nvmf" # Uses: 0 00:09:44.517 ###### End of recommended dictionary. ###### 00:09:44.517 Done 48 runs in 2 second(s) 00:09:44.776 11:45:35 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:09:44.776 11:45:35 -- ../common.sh@72 -- # (( i++ )) 00:09:44.776 11:45:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:44.776 11:45:35 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:09:44.776 11:45:35 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:09:44.776 11:45:35 -- nvmf/run.sh@24 -- # local timen=1 00:09:44.776 11:45:35 -- nvmf/run.sh@25 -- # local core=0x1 00:09:44.776 11:45:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:44.776 11:45:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:09:44.776 11:45:35 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:44.776 11:45:35 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:44.776 11:45:35 -- nvmf/run.sh@34 -- # printf %02d 22 00:09:44.776 11:45:35 -- nvmf/run.sh@34 -- # port=4422 00:09:44.776 11:45:35 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:44.776 11:45:35 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:09:44.776 11:45:35 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:44.776 11:45:35 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:44.776 11:45:35 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:44.776 11:45:35 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:09:45.035 [2024-04-18 11:45:35.327700] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:09:45.035 [2024-04-18 11:45:35.327812] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid375711 ] 00:09:45.035 EAL: No free 2048 kB hugepages reported on node 1 00:09:45.295 [2024-04-18 11:45:35.590873] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:45.295 [2024-04-18 11:45:35.742926] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.554 [2024-04-18 11:45:35.990892] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:45.554 [2024-04-18 11:45:36.007113] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:09:45.554 INFO: Running with entropic power schedule (0xFF, 100). 00:09:45.554 INFO: Seed: 214567443 00:09:45.554 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:45.554 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:45.554 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:09:45.554 INFO: A corpus is not provided, starting from an empty corpus 00:09:45.554 #2 INITED exec/s: 0 rss: 200Mb 00:09:45.554 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:45.554 This may also happen if the target rejected all inputs we tried so far 00:09:45.554 [2024-04-18 11:45:36.087937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:45.554 [2024-04-18 11:45:36.087987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.072 NEW_FUNC[1/672]: 0x575970 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:09:46.072 NEW_FUNC[2/672]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:46.072 [2024-04-18 11:45:36.458436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.072 [2024-04-18 11:45:36.458488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.072 #7 NEW cov: 11823 ft: 11787 corp: 2/19b lim: 85 exec/s: 0 rss: 217Mb L: 18/18 MS: 4 ShuffleBytes-ChangeByte-CrossOver-InsertRepeatedBytes- 00:09:46.072 [2024-04-18 11:45:36.538323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.072 [2024-04-18 11:45:36.538365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.072 [2024-04-18 11:45:36.598722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.072 [2024-04-18 11:45:36.598756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.330 #15 NEW cov: 11847 ft: 12122 corp: 3/46b lim: 85 exec/s: 0 rss: 218Mb L: 27/27 MS: 2 CrossOver-CrossOver- 00:09:46.330 [2024-04-18 11:45:36.658860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:09:46.330 [2024-04-18 11:45:36.658895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.330 [2024-04-18 11:45:36.719062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.330 [2024-04-18 11:45:36.719093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.330 #17 NEW cov: 11859 ft: 12629 corp: 4/73b lim: 85 exec/s: 0 rss: 221Mb L: 27/27 MS: 1 ChangeBinInt- 00:09:46.330 [2024-04-18 11:45:36.792499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.330 [2024-04-18 11:45:36.792538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.330 [2024-04-18 11:45:36.792633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.330 [2024-04-18 11:45:36.792654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.330 [2024-04-18 11:45:36.852871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.330 [2024-04-18 11:45:36.852906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.330 [2024-04-18 11:45:36.852998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.330 [2024-04-18 11:45:36.853020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.590 #19 NEW cov: 11945 ft: 13738 corp: 5/107b lim: 85 exec/s: 0 rss: 222Mb L: 34/34 MS: 1 CopyPart- 00:09:46.590 [2024-04-18 11:45:36.925327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.590 [2024-04-18 11:45:36.925364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.590 [2024-04-18 11:45:36.975743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.590 [2024-04-18 11:45:36.975779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.590 #21 NEW cov: 11945 ft: 13883 corp: 6/134b lim: 85 exec/s: 0 rss: 223Mb L: 27/34 MS: 1 CopyPart- 00:09:46.590 [2024-04-18 11:45:37.040874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.590 [2024-04-18 11:45:37.040909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.590 [2024-04-18 11:45:37.041006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.590 [2024-04-18 11:45:37.041025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.590 [2024-04-18 11:45:37.101289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:09:46.590 [2024-04-18 11:45:37.101323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.590 [2024-04-18 11:45:37.101426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.590 [2024-04-18 11:45:37.101445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.590 #23 NEW cov: 11945 ft: 13948 corp: 7/176b lim: 85 exec/s: 23 rss: 224Mb L: 42/42 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:09:46.849 [2024-04-18 11:45:37.160970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.849 [2024-04-18 11:45:37.161007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.211309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.849 [2024-04-18 11:45:37.211342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.849 #25 NEW cov: 11945 ft: 13963 corp: 8/194b lim: 85 exec/s: 25 rss: 227Mb L: 18/42 MS: 1 ShuffleBytes- 00:09:46.849 [2024-04-18 11:45:37.273363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.849 [2024-04-18 11:45:37.273397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.273471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.849 [2024-04-18 11:45:37.273494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.273546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:46.849 [2024-04-18 11:45:37.273576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.273663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:46.849 [2024-04-18 11:45:37.273686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.333563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:46.849 [2024-04-18 11:45:37.333597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.333664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:46.849 [2024-04-18 11:45:37.333686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.333742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:46.849 [2024-04-18 11:45:37.333765] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:46.849 [2024-04-18 11:45:37.333849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:09:46.849 [2024-04-18 11:45:37.333869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:46.849 #27 NEW cov: 11945 ft: 14561 corp: 9/270b lim: 85 exec/s: 27 rss: 227Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:09:47.108 [2024-04-18 11:45:37.408458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.108 [2024-04-18 11:45:37.408495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.108 [2024-04-18 11:45:37.408582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.108 [2024-04-18 11:45:37.408601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.108 [2024-04-18 11:45:37.468786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.108 [2024-04-18 11:45:37.468822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.108 [2024-04-18 11:45:37.468912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.108 [2024-04-18 11:45:37.468933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.108 #29 NEW cov: 11945 ft: 14574 corp: 10/313b lim: 85 exec/s: 29 rss: 229Mb L: 43/76 MS: 1 InsertByte- 00:09:47.108 [2024-04-18 11:45:37.541468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.108 [2024-04-18 11:45:37.541505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.108 [2024-04-18 11:45:37.591776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.108 [2024-04-18 11:45:37.591811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.108 #31 NEW cov: 11945 ft: 14643 corp: 11/340b lim: 85 exec/s: 31 rss: 230Mb L: 27/76 MS: 1 ChangeBit- 00:09:47.368 [2024-04-18 11:45:37.662730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.368 [2024-04-18 11:45:37.662769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.662854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.368 [2024-04-18 11:45:37.662879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.662969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.368 [2024-04-18 11:45:37.662993] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.713061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.368 [2024-04-18 11:45:37.713092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.713171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.368 [2024-04-18 11:45:37.713191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.713275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.368 [2024-04-18 11:45:37.713294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.368 #33 NEW cov: 11945 ft: 14920 corp: 12/407b lim: 85 exec/s: 33 rss: 231Mb L: 67/76 MS: 1 InsertRepeatedBytes- 00:09:47.368 [2024-04-18 11:45:37.784573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.368 [2024-04-18 11:45:37.784609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.784708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.368 [2024-04-18 11:45:37.784728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.368 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:47.368 [2024-04-18 11:45:37.844760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.368 [2024-04-18 11:45:37.844794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.844883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.368 [2024-04-18 11:45:37.844902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.368 #35 NEW cov: 11962 ft: 15029 corp: 13/450b lim: 85 exec/s: 35 rss: 233Mb L: 43/76 MS: 1 ChangeByte- 00:09:47.368 [2024-04-18 11:45:37.916320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.368 [2024-04-18 11:45:37.916363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.916475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.368 [2024-04-18 11:45:37.916502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.368 [2024-04-18 11:45:37.916602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.368 
[2024-04-18 11:45:37.916627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:37.966520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.628 [2024-04-18 11:45:37.966551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:37.966618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.628 [2024-04-18 11:45:37.966638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:37.966689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.628 [2024-04-18 11:45:37.966708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.628 #37 NEW cov: 11962 ft: 15064 corp: 14/503b lim: 85 exec/s: 37 rss: 233Mb L: 53/76 MS: 1 InsertRepeatedBytes- 00:09:47.628 [2024-04-18 11:45:38.027571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.628 [2024-04-18 11:45:38.027609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:38.027698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.628 [2024-04-18 11:45:38.027720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:38.027810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.628 [2024-04-18 11:45:38.027829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:38.087895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:09:47.628 [2024-04-18 11:45:38.087927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:38.088001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:09:47.628 [2024-04-18 11:45:38.088022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:47.628 [2024-04-18 11:45:38.088105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:09:47.628 [2024-04-18 11:45:38.088128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:47.628 #39 NEW cov: 11962 ft: 15080 corp: 15/570b lim: 85 exec/s: 19 rss: 236Mb L: 67/76 MS: 1 ChangeByte- 00:09:47.628 #39 DONE cov: 11962 ft: 15080 corp: 15/570b lim: 85 exec/s: 19 rss: 236Mb 00:09:47.628 ###### Recommended dictionary. ###### 00:09:47.628 "\001\000\000\000\000\000\000\000" # Uses: 0 00:09:47.628 ###### End of recommended dictionary. 
###### 00:09:47.628 Done 39 runs in 2 second(s) 00:09:48.196 11:45:38 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:09:48.196 11:45:38 -- ../common.sh@72 -- # (( i++ )) 00:09:48.196 11:45:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:48.196 11:45:38 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:09:48.196 11:45:38 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:09:48.196 11:45:38 -- nvmf/run.sh@24 -- # local timen=1 00:09:48.196 11:45:38 -- nvmf/run.sh@25 -- # local core=0x1 00:09:48.196 11:45:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:48.196 11:45:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:09:48.196 11:45:38 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:09:48.196 11:45:38 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:09:48.196 11:45:38 -- nvmf/run.sh@34 -- # printf %02d 23 00:09:48.196 11:45:38 -- nvmf/run.sh@34 -- # port=4423 00:09:48.196 11:45:38 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:48.196 11:45:38 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:09:48.196 11:45:38 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:09:48.196 11:45:38 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:48.196 11:45:38 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:09:48.196 11:45:38 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:09:48.196 [2024-04-18 11:45:38.599522] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:48.196 [2024-04-18 11:45:38.599634] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid376094 ] 00:09:48.196 EAL: No free 2048 kB hugepages reported on node 1 00:09:48.455 [2024-04-18 11:45:38.874190] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:48.714 [2024-04-18 11:45:39.030597] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:48.973 [2024-04-18 11:45:39.275289] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:09:48.973 [2024-04-18 11:45:39.291519] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:09:48.973 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:48.973 INFO: Seed: 3501536091 00:09:48.973 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:48.973 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:48.973 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:09:48.973 INFO: A corpus is not provided, starting from an empty corpus 00:09:48.973 #2 INITED exec/s: 0 rss: 199Mb 00:09:48.973 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:48.973 This may also happen if the target rejected all inputs we tried so far 00:09:48.973 [2024-04-18 11:45:39.347182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:48.973 [2024-04-18 11:45:39.347229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:48.973 [2024-04-18 11:45:39.347275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:48.973 [2024-04-18 11:45:39.347298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:48.973 [2024-04-18 11:45:39.347340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:48.973 [2024-04-18 11:45:39.347362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.232 NEW_FUNC[1/670]: 0x579270 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:09:49.232 NEW_FUNC[2/670]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:49.232 [2024-04-18 11:45:39.718073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.232 [2024-04-18 11:45:39.718133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.232 [2024-04-18 11:45:39.718201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.232 [2024-04-18 11:45:39.718224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.232 [2024-04-18 11:45:39.718261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:49.232 [2024-04-18 11:45:39.718282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.232 #7 NEW cov: 11750 ft: 11729 corp: 2/18b lim: 25 exec/s: 0 rss: 216Mb L: 17/17 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:09:49.492 [2024-04-18 11:45:39.786349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.492 [2024-04-18 11:45:39.786399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.786482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.492 [2024-04-18 11:45:39.786507] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.492 NEW_FUNC[1/1]: 0x15aff90 in nvmf_tcp_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3347 00:09:49.492 [2024-04-18 11:45:39.856514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.492 [2024-04-18 11:45:39.856556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.856601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.492 [2024-04-18 11:45:39.856625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.492 #9 NEW cov: 11780 ft: 12348 corp: 3/29b lim: 25 exec/s: 0 rss: 218Mb L: 11/17 MS: 1 EraseBytes- 00:09:49.492 [2024-04-18 11:45:39.911036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.492 [2024-04-18 11:45:39.911078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.911121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.492 [2024-04-18 11:45:39.911145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.911187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:49.492 [2024-04-18 11:45:39.911208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.911249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:49.492 [2024-04-18 11:45:39.911270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.981255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.492 [2024-04-18 11:45:39.981296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.981339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.492 [2024-04-18 11:45:39.981364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.981406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:49.492 [2024-04-18 11:45:39.981451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.492 [2024-04-18 11:45:39.981491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:49.492 [2024-04-18 11:45:39.981515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 
p:0 m:0 dnr:1 00:09:49.492 #11 NEW cov: 11792 ft: 13135 corp: 4/50b lim: 25 exec/s: 0 rss: 220Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:09:49.752 [2024-04-18 11:45:40.044099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.752 [2024-04-18 11:45:40.044150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.044201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.752 [2024-04-18 11:45:40.044228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.044276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:49.752 [2024-04-18 11:45:40.044300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.044342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:49.752 [2024-04-18 11:45:40.044366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:49.752 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:49.752 [2024-04-18 11:45:40.124057] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.752 [2024-04-18 11:45:40.124103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.124148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:49.752 [2024-04-18 11:45:40.124179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.124224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:49.752 [2024-04-18 11:45:40.124245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.124283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:49.752 [2024-04-18 11:45:40.124304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:49.752 #13 NEW cov: 11895 ft: 13498 corp: 5/72b lim: 25 exec/s: 0 rss: 222Mb L: 22/22 MS: 1 CrossOver- 00:09:49.752 [2024-04-18 11:45:40.187004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.752 [2024-04-18 11:45:40.187047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:49.752 [2024-04-18 11:45:40.247144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:49.752 [2024-04-18 11:45:40.247183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:09:49.752 #18 NEW cov: 11895 ft: 13963 corp: 6/79b lim: 25 exec/s: 0 rss: 223Mb L: 7/22 MS: 4 ShuffleBytes-CrossOver-ChangeByte-InsertRepeatedBytes- 00:09:50.011 [2024-04-18 11:45:40.309884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.011 [2024-04-18 11:45:40.309927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.309973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.011 [2024-04-18 11:45:40.309998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.310037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.011 [2024-04-18 11:45:40.310058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.310101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.011 [2024-04-18 11:45:40.310123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.380029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.011 [2024-04-18 11:45:40.380068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.380114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.011 [2024-04-18 11:45:40.380137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.380175] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.011 [2024-04-18 11:45:40.380196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.380240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.011 [2024-04-18 11:45:40.380261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.011 #20 NEW cov: 11895 ft: 14058 corp: 7/101b lim: 25 exec/s: 20 rss: 224Mb L: 22/22 MS: 1 ShuffleBytes- 00:09:50.011 [2024-04-18 11:45:40.444263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.011 [2024-04-18 11:45:40.444306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.011 [2024-04-18 11:45:40.504421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.011 [2024-04-18 11:45:40.504460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.011 #22 NEW cov: 11895 ft: 14165 corp: 8/109b lim: 25 
exec/s: 22 rss: 226Mb L: 8/22 MS: 1 CrossOver- 00:09:50.270 [2024-04-18 11:45:40.565793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.270 [2024-04-18 11:45:40.565837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.565885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.270 [2024-04-18 11:45:40.565910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.625961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.270 [2024-04-18 11:45:40.626001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.626045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.270 [2024-04-18 11:45:40.626069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.270 #26 NEW cov: 11895 ft: 14213 corp: 9/119b lim: 25 exec/s: 26 rss: 227Mb L: 10/22 MS: 3 InsertByte-CopyPart-CMP- DE: "9\213\372\254\367\375\004\000"- 00:09:50.270 [2024-04-18 11:45:40.688423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.270 [2024-04-18 11:45:40.688466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.688511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.270 [2024-04-18 11:45:40.688534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.748532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.270 [2024-04-18 11:45:40.748571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.748623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.270 [2024-04-18 11:45:40.748648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.270 #31 NEW cov: 11895 ft: 14244 corp: 10/133b lim: 25 exec/s: 31 rss: 228Mb L: 14/22 MS: 4 ShuffleBytes-ChangeBit-CrossOver-CrossOver- 00:09:50.270 [2024-04-18 11:45:40.808938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.270 [2024-04-18 11:45:40.808980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.809024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.270 [2024-04-18 11:45:40.809048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:09:50.270 [2024-04-18 11:45:40.809087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.270 [2024-04-18 11:45:40.809109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.270 [2024-04-18 11:45:40.809151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.270 [2024-04-18 11:45:40.809173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:40.879039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.529 [2024-04-18 11:45:40.879080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:40.879123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.529 [2024-04-18 11:45:40.879147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:40.879186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.529 [2024-04-18 11:45:40.879208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:40.879245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.529 [2024-04-18 11:45:40.879267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.529 #33 NEW cov: 11895 ft: 14311 corp: 11/157b lim: 25 exec/s: 33 rss: 229Mb L: 24/24 MS: 1 CopyPart- 00:09:50.529 [2024-04-18 11:45:40.940946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.529 [2024-04-18 11:45:40.940987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:41.011074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.529 [2024-04-18 11:45:41.011114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.529 #35 NEW cov: 11895 ft: 14323 corp: 12/163b lim: 25 exec/s: 35 rss: 231Mb L: 6/24 MS: 1 EraseBytes- 00:09:50.529 [2024-04-18 11:45:41.071000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.529 [2024-04-18 11:45:41.071044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:41.071088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.529 [2024-04-18 11:45:41.071112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:41.071151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 
cid:2 nsid:0 00:09:50.529 [2024-04-18 11:45:41.071173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.529 [2024-04-18 11:45:41.071210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.529 [2024-04-18 11:45:41.071232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.787 [2024-04-18 11:45:41.131103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.787 [2024-04-18 11:45:41.131143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.787 [2024-04-18 11:45:41.131186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.787 [2024-04-18 11:45:41.131210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.787 [2024-04-18 11:45:41.131254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.787 [2024-04-18 11:45:41.131276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.787 [2024-04-18 11:45:41.131313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.788 [2024-04-18 11:45:41.131335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.788 #37 NEW cov: 11895 ft: 14401 corp: 13/185b lim: 25 exec/s: 37 rss: 232Mb L: 22/24 MS: 1 ChangeBit- 00:09:50.788 [2024-04-18 11:45:41.191053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.788 [2024-04-18 11:45:41.191095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.191139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.788 [2024-04-18 11:45:41.191163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.191203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.788 [2024-04-18 11:45:41.191224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.191262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.788 [2024-04-18 11:45:41.191283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.271195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.788 [2024-04-18 11:45:41.271234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.271277] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.788 [2024-04-18 11:45:41.271301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.271340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:09:50.788 [2024-04-18 11:45:41.271362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.271400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:09:50.788 [2024-04-18 11:45:41.271427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:50.788 #39 NEW cov: 11895 ft: 14477 corp: 14/207b lim: 25 exec/s: 39 rss: 233Mb L: 22/24 MS: 1 ChangeByte- 00:09:50.788 [2024-04-18 11:45:41.322511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:50.788 [2024-04-18 11:45:41.322552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:50.788 [2024-04-18 11:45:41.322598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:50.788 [2024-04-18 11:45:41.322623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:51.047 [2024-04-18 11:45:41.392660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:09:51.047 [2024-04-18 11:45:41.392700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:51.047 [2024-04-18 11:45:41.392745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:09:51.047 [2024-04-18 11:45:41.392775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:51.047 #41 NEW cov: 11895 ft: 14517 corp: 15/217b lim: 25 exec/s: 20 rss: 235Mb L: 10/24 MS: 1 ChangeBinInt- 00:09:51.047 #41 DONE cov: 11895 ft: 14517 corp: 15/217b lim: 25 exec/s: 20 rss: 235Mb 00:09:51.047 ###### Recommended dictionary. ###### 00:09:51.047 "9\213\372\254\367\375\004\000" # Uses: 0 00:09:51.047 ###### End of recommended dictionary. 
######
00:09:51.047 Done 41 runs in 2 second(s)
00:09:51.615 11:45:41 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz
00:09:51.615 11:45:41 -- ../common.sh@72 -- # (( i++ ))
00:09:51.615 11:45:41 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:09:51.615 11:45:41 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:09:51.615 11:45:41 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:09:51.615 11:45:41 -- nvmf/run.sh@24 -- # local timen=1
00:09:51.615 11:45:41 -- nvmf/run.sh@25 -- # local core=0x1
00:09:51.615 11:45:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:09:51.615 11:45:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:09:51.615 11:45:41 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:09:51.615 11:45:41 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:09:51.615 11:45:41 -- nvmf/run.sh@34 -- # printf %02d 24
00:09:51.615 11:45:41 -- nvmf/run.sh@34 -- # port=4424
00:09:51.615 11:45:41 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:09:51.615 11:45:41 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:09:51.615 11:45:41 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:09:51.615 11:45:41 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:09:51.615 11:45:41 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:09:51.615 11:45:41 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24
[2024-04-18 11:45:41.923815] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization...
[2024-04-18 11:45:41.923911] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid376628 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-04-18 11:45:42.188948] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-04-18 11:45:42.342446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-04-18 11:45:42.586444] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-04-18 11:45:42.602656] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
INFO: Running with entropic power schedule (0xFF, 100).
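[Editorial note] The nvmf/run.sh trace above (@23 through @45) is the entire per-instance setup: fuzzer 24 zero-pads its index with printf %02d, appends it to the 44xx port scheme to get 4424, rewrites the shared fuzz_json.conf template (which listens on 4420) to that port, and records two expected-leak suppressions before launching llvm_nvme_fuzz. A minimal bash sketch of that derivation, reconstructed from the trace; the shortened template path, the output redirections, and the export are illustrative assumptions, not the script itself:

    #!/usr/bin/env bash
    # Sketch of the per-fuzzer config derivation seen in the trace above.
    fuzzer_type=24                             # index passed to start_llvm_fuzz
    port="44$(printf %02d "$fuzzer_type")"     # printf %02d 24 -> "24", so port 4424
    nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    suppress_file=/var/tmp/suppress_nvmf_fuzz

    # Rewrite the shared template, which listens on 4420, to this instance's port.
    # (fuzz_json.conf path shortened here; the trace shows the full repo path.)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" fuzz_json.conf > "$nvmf_cfg"

    # Suppress leaks expected when qpairs are torn down mid-connection
    # (the two symbols echoed at run.sh@41-42; the redirection is assumed).
    printf '%s\n' leak:spdk_nvmf_qpair_disconnect leak:nvmf_ctrlr_create > "$suppress_file"
    export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"

With those pieces in hand, the @45 line hands everything to the fuzzer: -F carries the transport ID with the derived trsvcid, -c the rewritten JSON config, -D the per-fuzzer corpus directory, and -t the time budget (timen=1). This is why each of the sequentially numbered fuzzers in this log binds its own NVMe/TCP port instead of colliding on 4420.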
00:09:52.132 INFO: Seed: 2517549973 00:09:52.132 INFO: Loaded 1 modules (351502 inline 8-bit counters): 351502 [0x346dd0c, 0x34c3a1a), 00:09:52.132 INFO: Loaded 1 PC tables (351502 PCs): 351502 [0x34c3a20,0x3a20b00), 00:09:52.132 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:09:52.132 INFO: A corpus is not provided, starting from an empty corpus 00:09:52.132 #2 INITED exec/s: 0 rss: 199Mb 00:09:52.132 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:52.132 This may also happen if the target rejected all inputs we tried so far 00:09:52.133 [2024-04-18 11:45:42.658839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.133 [2024-04-18 11:45:42.658880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.133 [2024-04-18 11:45:42.658940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.133 [2024-04-18 11:45:42.658960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.133 [2024-04-18 11:45:42.659019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.133 [2024-04-18 11:45:42.659037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.133 [2024-04-18 11:45:42.659086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.133 [2024-04-18 11:45:42.659104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.652 NEW_FUNC[1/672]: 0x57a570 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:09:52.652 NEW_FUNC[2/672]: 0x58d4c0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:09:52.652 [2024-04-18 11:45:42.999687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:42.999742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:42.999806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:42.999824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:42.999877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:42.999911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:42.999965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:42.999983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.652 #14 NEW cov: 11828 ft: 11801 corp: 2/82b lim: 100 exec/s: 0 rss: 216Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:09:52.652 [2024-04-18 11:45:43.048645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.048686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.048761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.048781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.048834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.048852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.048911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.048932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.098705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.098743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.098783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.098801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.098860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.098878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.098933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.098951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.652 
#21 NEW cov: 11852 ft: 12234 corp: 3/164b lim: 100 exec/s: 0 rss: 218Mb L: 82/82 MS: 1 CrossOver- 00:09:52.652 [2024-04-18 11:45:43.140788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.140824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.140875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.140894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.140950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.140968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.141022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.141043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.180811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.180848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.180886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.180904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.180963] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.180984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.652 [2024-04-18 11:45:43.181040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.652 [2024-04-18 11:45:43.181057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.912 #23 NEW cov: 11864 ft: 12450 corp: 4/245b lim: 100 exec/s: 0 rss: 220Mb L: 81/82 MS: 1 CopyPart- 00:09:52.912 [2024-04-18 11:45:43.241745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.241781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.241822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.241841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.241898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.241916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.241970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.241988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.291860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.291896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.291949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.291969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.292025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.292046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.912 [2024-04-18 11:45:43.292102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.912 [2024-04-18 11:45:43.292126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.913 #25 NEW cov: 11950 ft: 12625 corp: 5/327b lim: 100 exec/s: 0 rss: 221Mb L: 82/82 MS: 1 CopyPart- 00:09:52.913 [2024-04-18 11:45:43.353073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.353109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.353166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.353186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.913 
[2024-04-18 11:45:43.353243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.353262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.353317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.353334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.913 NEW_FUNC[1/1]: 0x1da2080 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:52.913 [2024-04-18 11:45:43.403164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.403198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.403239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.403257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.403310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.403328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.403381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.403399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:52.913 #27 NEW cov: 11967 ft: 12753 corp: 6/409b lim: 100 exec/s: 0 rss: 223Mb L: 82/82 MS: 1 ChangeBinInt- 00:09:52.913 [2024-04-18 11:45:43.445744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.445778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.445827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.445844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.445897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.445915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:52.913 [2024-04-18 11:45:43.445972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:52.913 [2024-04-18 11:45:43.445990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.485883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.485916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.485964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.485983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.486040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.486058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.486111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.486128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.172 #29 NEW cov: 11967 ft: 13060 corp: 7/490b lim: 100 exec/s: 0 rss: 224Mb L: 81/82 MS: 1 ChangeByte- 00:09:53.172 [2024-04-18 11:45:43.528379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.528417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.528458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.528476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.528532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.528550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.528604] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.528622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:09:53.172 [2024-04-18 11:45:43.578520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.578553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.578614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.578633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.578686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.578704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.578759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.578777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.172 #31 NEW cov: 11967 ft: 13212 corp: 8/571b lim: 100 exec/s: 0 rss: 226Mb L: 81/82 MS: 1 CopyPart- 00:09:53.172 [2024-04-18 11:45:43.620874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.620910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.620948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.620967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.621021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.621039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.621094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.621112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.670988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.671022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.671067] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.671085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.671138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.172 [2024-04-18 11:45:43.671155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.172 [2024-04-18 11:45:43.671215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.173 [2024-04-18 11:45:43.671232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.173 #33 NEW cov: 11967 ft: 13248 corp: 9/670b lim: 100 exec/s: 33 rss: 227Mb L: 99/99 MS: 1 CopyPart- 00:09:53.173 [2024-04-18 11:45:43.716999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.173 [2024-04-18 11:45:43.717034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.173 [2024-04-18 11:45:43.717080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744065119617023 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.173 [2024-04-18 11:45:43.717098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.173 [2024-04-18 11:45:43.717156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.173 [2024-04-18 11:45:43.717174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.173 [2024-04-18 11:45:43.717230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.173 [2024-04-18 11:45:43.717250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.431 [2024-04-18 11:45:43.767097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.767131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.767170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744065119617023 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.767188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.767243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 
cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.767261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.767315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.767333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.432 #40 NEW cov: 11967 ft: 13319 corp: 10/751b lim: 100 exec/s: 40 rss: 229Mb L: 81/99 MS: 1 ChangeBit- 00:09:53.432 [2024-04-18 11:45:43.809975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65535 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.810011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.810069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.810088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.810144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.810163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.810219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.810238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.850117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65535 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.850152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.850192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.850210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.850263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.850282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.850341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.850359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.432 #42 NEW cov: 11967 ft: 13400 corp: 11/832b lim: 100 exec/s: 42 rss: 230Mb L: 81/99 MS: 1 ChangeBinInt- 00:09:53.432 [2024-04-18 11:45:43.896936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.896970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.897027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.897046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.897101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.897119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.897174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.897192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.937032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.937066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.937116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.937134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.937191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.937209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.432 [2024-04-18 11:45:43.937266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.432 [2024-04-18 11:45:43.937283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.432 #44 NEW cov: 11967 ft: 13516 corp: 12/926b lim: 100 exec/s: 44 rss: 231Mb L: 94/99 MS: 1 InsertRepeatedBytes- 00:09:53.692 [2024-04-18 11:45:43.984437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:43.984471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:43.984533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:43.984552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:43.984608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:43.984626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:43.984680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:43.984698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.034392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.034433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.034474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.034492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.034547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.034565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.034620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.034638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.692 #46 NEW cov: 11967 ft: 13544 corp: 13/1008b lim: 100 exec/s: 46 rss: 232Mb L: 82/99 MS: 1 InsertByte- 00:09:53.692 [2024-04-18 11:45:44.077155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.077191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.077245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 
len:65288 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.077264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.117218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.117253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.117299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65288 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.117317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 #53 NEW cov: 11967 ft: 14022 corp: 14/1052b lim: 100 exec/s: 53 rss: 234Mb L: 44/99 MS: 1 EraseBytes- 00:09:53.692 [2024-04-18 11:45:44.164715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65535 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.164751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.164804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.164824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.164877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.164895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.164950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.164968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.214809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65535 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.214843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.214900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.214918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.214972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 
[2024-04-18 11:45:44.214990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.692 [2024-04-18 11:45:44.215045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.692 [2024-04-18 11:45:44.215062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.692 #55 NEW cov: 11967 ft: 14134 corp: 15/1133b lim: 100 exec/s: 55 rss: 235Mb L: 81/99 MS: 1 CMP- DE: "\000\000\000\000\000\000\004\000"- 00:09:53.952 [2024-04-18 11:45:44.261648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.261685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.261745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.261764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.261815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.261833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.261887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.261905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.301652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.301691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.301754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073693495295 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.301773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.301827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.301871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.301928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 
11:45:44.301947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 #57 NEW cov: 11967 ft: 14157 corp: 16/1215b lim: 100 exec/s: 57 rss: 236Mb L: 82/99 MS: 1 ShuffleBytes- 00:09:53.952 [2024-04-18 11:45:44.344879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.344913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.344952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.344970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.345023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.345041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.345098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.345116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.384953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.384988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.385034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.385052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.385107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.385142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.385203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.385221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 #59 NEW cov: 11967 ft: 14248 corp: 17/1297b lim: 100 exec/s: 59 rss: 238Mb L: 82/99 MS: 1 CrossOver- 00:09:53.952 [2024-04-18 11:45:44.432118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.432155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.432222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.432241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.432301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.432320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.432376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.432395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.472194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.472228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.472277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.472296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.472351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.472369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:53.952 [2024-04-18 11:45:44.472429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:53.952 [2024-04-18 11:45:44.472464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:53.952 #61 NEW cov: 11967 ft: 14311 corp: 18/1378b lim: 100 exec/s: 61 rss: 239Mb L: 81/99 MS: 1 ShuffleBytes- 00:09:54.212 [2024-04-18 11:45:44.520494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.212 [2024-04-18 11:45:44.520530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:54.212 [2024-04-18 11:45:44.520567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.212 
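(These notices are the fuzzer driving NVMe COMPARE commands at boundary LBAs: 18446744073709551615 is 2^64 - 1, a deliberately out-of-range starting LBA, and the target rejects each one with status (00/0b), i.e. status code type 0x0 (generic) / status code 0x0b (Invalid Namespace or Format), with dnr:1 marking it do-not-retry. A quick check of the LBA value, as a shell one-liner rather than anything from the scripts:

    printf '%x\n' 18446744073709551615    # ffffffffffffffff == UINT64_MAX
)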
[2024-04-18 11:45:44.520586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:54.212 [2024-04-18 11:45:44.520640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.212 [2024-04-18 11:45:44.520659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:54.212 [2024-04-18 11:45:44.520717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.520735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.560556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.560589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.560648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.560668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.560723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.560741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.560797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.560815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:54.213 #63 NEW cov: 11967 ft: 14318 corp: 19/1459b lim: 100 exec/s: 63 rss: 239Mb L: 81/99 MS: 1 ShuffleBytes- 00:09:54.213 [2024-04-18 11:45:44.603760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.603796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.603859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744065119617023 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.603877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.603933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.603952] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.604007] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551370 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.604024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.653905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.653940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.653976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744065119617023 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.653995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.654051] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.654089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:09:54.213 [2024-04-18 11:45:44.654145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551370 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:09:54.213 [2024-04-18 11:45:44.654164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:09:54.213 #65 NEW cov: 11967 ft: 14341 corp: 20/1541b lim: 100 exec/s: 32 rss: 241Mb L: 82/99 MS: 1 InsertByte- 00:09:54.213 #65 DONE cov: 11967 ft: 14341 corp: 20/1541b lim: 100 exec/s: 32 rss: 241Mb 00:09:54.213 ###### Recommended dictionary. ###### 00:09:54.213 "\000\000\000\000\000\000\004\000" # Uses: 0 00:09:54.213 ###### End of recommended dictionary. 
###### 00:09:54.213 Done 65 runs in 2 second(s) 00:09:54.782 11:45:45 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:09:54.782 11:45:45 -- ../common.sh@72 -- # (( i++ )) 00:09:54.782 11:45:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:54.782 11:45:45 -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:09:54.782 00:09:54.782 real 1m22.196s 00:09:54.782 user 1m42.287s 00:09:54.782 sys 0m11.895s 00:09:54.782 11:45:45 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:09:54.782 11:45:45 -- common/autotest_common.sh@10 -- # set +x 00:09:54.782 ************************************ 00:09:54.782 END TEST nvmf_fuzz 00:09:54.782 ************************************ 00:09:54.782 11:45:45 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:09:54.782 11:45:45 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:09:54.782 11:45:45 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:54.782 11:45:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:09:54.782 11:45:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:09:54.782 11:45:45 -- common/autotest_common.sh@10 -- # set +x 00:09:54.782 ************************************ 00:09:54.782 START TEST vfio_fuzz 00:09:54.782 ************************************ 00:09:54.782 11:45:45 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:09:55.044 * Looking for test storage... 00:09:55.044 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.044 11:45:45 -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:09:55.044 11:45:45 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:09:55.044 11:45:45 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:55.044 11:45:45 -- common/autotest_common.sh@34 -- # set -e 00:09:55.044 11:45:45 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:55.044 11:45:45 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:55.044 11:45:45 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:09:55.044 11:45:45 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:55.044 11:45:45 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:09:55.044 11:45:45 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:55.044 11:45:45 -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:09:55.044 11:45:45 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:55.044 11:45:45 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:55.044 11:45:45 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:55.044 11:45:45 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:55.045 11:45:45 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:55.045 11:45:45 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:55.045 11:45:45 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:55.045 11:45:45 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:55.045 11:45:45 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:55.045 11:45:45 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:55.045 11:45:45 -- common/build_config.sh@13 -- # 
CONFIG_VTUNE=n 00:09:55.045 11:45:45 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:55.045 11:45:45 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:55.045 11:45:45 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:09:55.045 11:45:45 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:55.045 11:45:45 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:55.045 11:45:45 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:09:55.045 11:45:45 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:09:55.045 11:45:45 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:09:55.045 11:45:45 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:55.045 11:45:45 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:09:55.045 11:45:45 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:09:55.045 11:45:45 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:55.045 11:45:45 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:55.045 11:45:45 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:09:55.045 11:45:45 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:09:55.045 11:45:45 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:09:55.045 11:45:45 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:09:55.045 11:45:45 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:09:55.045 11:45:45 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:55.045 11:45:45 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:09:55.045 11:45:45 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:09:55.045 11:45:45 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:09:55.045 11:45:45 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:09:55.045 11:45:45 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:09:55.045 11:45:45 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:09:55.045 11:45:45 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:09:55.045 11:45:45 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:55.045 11:45:45 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:09:55.045 11:45:45 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:09:55.045 11:45:45 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:09:55.045 11:45:45 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:55.045 11:45:45 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:09:55.045 11:45:45 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:09:55.045 11:45:45 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:09:55.045 11:45:45 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:09:55.045 11:45:45 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:09:55.045 11:45:45 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:09:55.045 11:45:45 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:09:55.045 11:45:45 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:09:55.045 11:45:45 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:09:55.045 11:45:45 -- 
common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:09:55.045 11:45:45 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:09:55.045 11:45:45 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:09:55.045 11:45:45 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:09:55.045 11:45:45 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:09:55.045 11:45:45 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:09:55.045 11:45:45 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:09:55.045 11:45:45 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:09:55.045 11:45:45 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:55.045 11:45:45 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:09:55.045 11:45:45 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:09:55.045 11:45:45 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:09:55.045 11:45:45 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:09:55.045 11:45:45 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:09:55.045 11:45:45 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:09:55.045 11:45:45 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:09:55.045 11:45:45 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:09:55.045 11:45:45 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:09:55.045 11:45:45 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 00:09:55.045 11:45:45 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:09:55.045 11:45:45 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:55.045 11:45:45 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:09:55.045 11:45:45 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:09:55.045 11:45:45 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:55.045 11:45:45 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:09:55.045 11:45:45 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:55.045 11:45:45 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:09:55.045 11:45:45 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.045 11:45:45 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:55.045 11:45:45 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:09:55.045 11:45:45 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:55.045 11:45:45 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:55.045 11:45:45 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:55.045 11:45:45 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:55.045 11:45:45 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:55.045 11:45:45 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:55.045 11:45:45 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:55.045 11:45:45 -- common/applications.sh@22 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:09:55.045 11:45:45 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:55.045 #define SPDK_CONFIG_H 00:09:55.045 #define SPDK_CONFIG_APPS 1 00:09:55.045 #define SPDK_CONFIG_ARCH native 00:09:55.045 #define SPDK_CONFIG_ASAN 1 00:09:55.045 #undef SPDK_CONFIG_AVAHI 00:09:55.045 #undef SPDK_CONFIG_CET 00:09:55.045 #define SPDK_CONFIG_COVERAGE 1 00:09:55.045 #define SPDK_CONFIG_CROSS_PREFIX 00:09:55.045 #undef SPDK_CONFIG_CRYPTO 00:09:55.045 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:55.045 #undef SPDK_CONFIG_CUSTOMOCF 00:09:55.045 #undef SPDK_CONFIG_DAOS 00:09:55.045 #define SPDK_CONFIG_DAOS_DIR 00:09:55.045 #define SPDK_CONFIG_DEBUG 1 00:09:55.045 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:55.045 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:09:55.045 #define SPDK_CONFIG_DPDK_INC_DIR 00:09:55.045 #define SPDK_CONFIG_DPDK_LIB_DIR 00:09:55.045 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:55.045 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:09:55.045 #define SPDK_CONFIG_EXAMPLES 1 00:09:55.045 #undef SPDK_CONFIG_FC 00:09:55.045 #define SPDK_CONFIG_FC_PATH 00:09:55.045 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:55.045 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:55.045 #undef SPDK_CONFIG_FUSE 00:09:55.045 #define SPDK_CONFIG_FUZZER 1 00:09:55.045 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:09:55.045 #undef SPDK_CONFIG_GOLANG 00:09:55.045 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:55.045 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:09:55.045 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:55.045 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:09:55.045 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:55.045 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:55.045 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:55.045 #define SPDK_CONFIG_IDXD 1 00:09:55.045 #undef SPDK_CONFIG_IDXD_KERNEL 00:09:55.045 #undef SPDK_CONFIG_IPSEC_MB 00:09:55.045 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:55.045 #define SPDK_CONFIG_ISAL 1 00:09:55.045 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:55.045 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:55.045 #define SPDK_CONFIG_LIBDIR 00:09:55.045 #undef SPDK_CONFIG_LTO 00:09:55.045 #define SPDK_CONFIG_MAX_LCORES 00:09:55.045 #define SPDK_CONFIG_NVME_CUSE 1 00:09:55.045 #undef SPDK_CONFIG_OCF 00:09:55.045 #define SPDK_CONFIG_OCF_PATH 00:09:55.045 #define SPDK_CONFIG_OPENSSL_PATH 00:09:55.045 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:55.045 #define SPDK_CONFIG_PGO_DIR 00:09:55.045 #undef SPDK_CONFIG_PGO_USE 00:09:55.045 #define SPDK_CONFIG_PREFIX /usr/local 00:09:55.045 #undef SPDK_CONFIG_RAID5F 00:09:55.045 #undef SPDK_CONFIG_RBD 00:09:55.045 #define SPDK_CONFIG_RDMA 1 00:09:55.045 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:55.045 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:55.045 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:55.045 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:55.045 #undef SPDK_CONFIG_SHARED 00:09:55.045 #undef SPDK_CONFIG_SMA 00:09:55.045 #define SPDK_CONFIG_TESTS 1 00:09:55.045 #undef SPDK_CONFIG_TSAN 00:09:55.045 #define SPDK_CONFIG_UBLK 1 00:09:55.045 #define SPDK_CONFIG_UBSAN 1 00:09:55.045 #undef SPDK_CONFIG_UNIT_TESTS 00:09:55.045 #undef SPDK_CONFIG_URING 00:09:55.045 #define SPDK_CONFIG_URING_PATH 00:09:55.045 #undef SPDK_CONFIG_URING_ZNS 00:09:55.045 #undef SPDK_CONFIG_USDT 00:09:55.045 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:55.045 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 
00:09:55.045 #define SPDK_CONFIG_VFIO_USER 1 00:09:55.045 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:55.045 #define SPDK_CONFIG_VHOST 1 00:09:55.045 #define SPDK_CONFIG_VIRTIO 1 00:09:55.045 #undef SPDK_CONFIG_VTUNE 00:09:55.045 #define SPDK_CONFIG_VTUNE_DIR 00:09:55.046 #define SPDK_CONFIG_WERROR 1 00:09:55.046 #define SPDK_CONFIG_WPDK_DIR 00:09:55.046 #undef SPDK_CONFIG_XNVME 00:09:55.046 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:55.046 11:45:45 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:55.046 11:45:45 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:55.046 11:45:45 -- scripts/common.sh@502 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:55.046 11:45:45 -- scripts/common.sh@510 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:55.046 11:45:45 -- scripts/common.sh@511 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:55.046 11:45:45 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.046 11:45:45 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.046 11:45:45 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.046 11:45:45 -- paths/export.sh@5 -- # export PATH 00:09:55.046 11:45:45 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.046 11:45:45 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:55.046 11:45:45 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:09:55.046 11:45:45 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:55.046 11:45:45 -- pm/common@6 -- # 
_pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:09:55.046 11:45:45 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:55.046 11:45:45 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.046 11:45:45 -- pm/common@67 -- # TEST_TAG=N/A 00:09:55.046 11:45:45 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:09:55.046 11:45:45 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:09:55.046 11:45:45 -- pm/common@71 -- # uname -s 00:09:55.046 11:45:45 -- pm/common@71 -- # PM_OS=Linux 00:09:55.046 11:45:45 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:09:55.046 11:45:45 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:09:55.046 11:45:45 -- pm/common@76 -- # [[ Linux == Linux ]] 00:09:55.046 11:45:45 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:09:55.046 11:45:45 -- pm/common@76 -- # [[ ! -e /.dockerenv ]] 00:09:55.046 11:45:45 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:09:55.046 11:45:45 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:09:55.046 11:45:45 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:09:55.046 11:45:45 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:09:55.046 11:45:45 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:09:55.046 11:45:45 -- common/autotest_common.sh@57 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:09:55.046 11:45:45 -- common/autotest_common.sh@61 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:55.046 11:45:45 -- common/autotest_common.sh@63 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:09:55.046 11:45:45 -- common/autotest_common.sh@65 -- # : 1 00:09:55.046 11:45:45 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:55.046 11:45:45 -- common/autotest_common.sh@67 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:09:55.046 11:45:45 -- common/autotest_common.sh@69 -- # : 00:09:55.046 11:45:45 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:09:55.046 11:45:45 -- common/autotest_common.sh@71 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:09:55.046 11:45:45 -- common/autotest_common.sh@73 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:09:55.046 11:45:45 -- common/autotest_common.sh@75 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:09:55.046 11:45:45 -- common/autotest_common.sh@77 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:55.046 11:45:45 -- common/autotest_common.sh@79 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:09:55.046 11:45:45 -- common/autotest_common.sh@81 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:09:55.046 11:45:45 -- common/autotest_common.sh@83 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:09:55.046 11:45:45 -- common/autotest_common.sh@85 -- # : 0 00:09:55.046 11:45:45 -- 
common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:09:55.046 11:45:45 -- common/autotest_common.sh@87 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:09:55.046 11:45:45 -- common/autotest_common.sh@89 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:09:55.046 11:45:45 -- common/autotest_common.sh@91 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:09:55.046 11:45:45 -- common/autotest_common.sh@93 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:09:55.046 11:45:45 -- common/autotest_common.sh@95 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:55.046 11:45:45 -- common/autotest_common.sh@97 -- # : 1 00:09:55.046 11:45:45 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:09:55.046 11:45:45 -- common/autotest_common.sh@99 -- # : 1 00:09:55.046 11:45:45 -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:09:55.046 11:45:45 -- common/autotest_common.sh@101 -- # : rdma 00:09:55.046 11:45:45 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:55.046 11:45:45 -- common/autotest_common.sh@103 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:09:55.046 11:45:45 -- common/autotest_common.sh@105 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:09:55.046 11:45:45 -- common/autotest_common.sh@107 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:09:55.046 11:45:45 -- common/autotest_common.sh@109 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:09:55.046 11:45:45 -- common/autotest_common.sh@111 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:09:55.046 11:45:45 -- common/autotest_common.sh@113 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:09:55.046 11:45:45 -- common/autotest_common.sh@115 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:09:55.046 11:45:45 -- common/autotest_common.sh@117 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:55.046 11:45:45 -- common/autotest_common.sh@119 -- # : 1 00:09:55.046 11:45:45 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:09:55.046 11:45:45 -- common/autotest_common.sh@121 -- # : 1 00:09:55.046 11:45:45 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:09:55.046 11:45:45 -- common/autotest_common.sh@123 -- # : 00:09:55.046 11:45:45 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:55.046 11:45:45 -- common/autotest_common.sh@125 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:09:55.046 11:45:45 -- common/autotest_common.sh@127 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:09:55.046 11:45:45 -- common/autotest_common.sh@129 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:09:55.046 11:45:45 -- common/autotest_common.sh@131 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:09:55.046 11:45:45 -- common/autotest_common.sh@133 -- # : 0 
00:09:55.046 11:45:45 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:09:55.046 11:45:45 -- common/autotest_common.sh@135 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:09:55.046 11:45:45 -- common/autotest_common.sh@137 -- # : 00:09:55.046 11:45:45 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:09:55.046 11:45:45 -- common/autotest_common.sh@139 -- # : true 00:09:55.046 11:45:45 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:09:55.046 11:45:45 -- common/autotest_common.sh@141 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:09:55.046 11:45:45 -- common/autotest_common.sh@143 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:09:55.046 11:45:45 -- common/autotest_common.sh@145 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:09:55.046 11:45:45 -- common/autotest_common.sh@147 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:09:55.046 11:45:45 -- common/autotest_common.sh@149 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:09:55.046 11:45:45 -- common/autotest_common.sh@151 -- # : 0 00:09:55.046 11:45:45 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:09:55.046 11:45:45 -- common/autotest_common.sh@153 -- # : 00:09:55.046 11:45:45 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:09:55.046 11:45:45 -- common/autotest_common.sh@155 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:09:55.047 11:45:45 -- common/autotest_common.sh@157 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:09:55.047 11:45:45 -- common/autotest_common.sh@159 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:09:55.047 11:45:45 -- common/autotest_common.sh@161 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:09:55.047 11:45:45 -- common/autotest_common.sh@163 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:09:55.047 11:45:45 -- common/autotest_common.sh@166 -- # : 00:09:55.047 11:45:45 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:09:55.047 11:45:45 -- common/autotest_common.sh@168 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:09:55.047 11:45:45 -- common/autotest_common.sh@170 -- # : 0 00:09:55.047 11:45:45 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:55.047 11:45:45 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@176 
-- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:55.047 11:45:45 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:55.047 11:45:45 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:55.047 11:45:45 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:55.047 11:45:45 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:09:55.047 11:45:45 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:55.047 11:45:45 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:09:55.047 11:45:45 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:55.047 
11:45:45 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:55.047 11:45:45 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:55.047 11:45:45 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:55.047 11:45:45 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:55.047 11:45:45 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:09:55.047 11:45:45 -- common/autotest_common.sh@199 -- # cat 00:09:55.047 11:45:45 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:09:55.047 11:45:45 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:55.047 11:45:45 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:55.047 11:45:45 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:55.047 11:45:45 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:55.047 11:45:45 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:09:55.047 11:45:45 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:09:55.047 11:45:45 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:55.047 11:45:45 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:09:55.047 11:45:45 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:55.047 11:45:45 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:09:55.047 11:45:45 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:55.047 11:45:45 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:55.047 11:45:45 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:55.047 11:45:45 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:55.047 11:45:45 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:55.047 11:45:45 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:55.047 11:45:45 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:55.047 11:45:45 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:55.047 11:45:45 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:09:55.047 11:45:45 -- common/autotest_common.sh@252 -- # export valgrind= 00:09:55.047 11:45:45 -- common/autotest_common.sh@252 -- # valgrind= 00:09:55.047 11:45:45 -- common/autotest_common.sh@258 -- # uname -s 00:09:55.047 11:45:45 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:09:55.047 11:45:45 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:09:55.047 11:45:45 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:09:55.047 
11:45:45 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:09:55.047 11:45:45 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@268 -- # MAKE=make 00:09:55.047 11:45:45 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72 00:09:55.047 11:45:45 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:09:55.047 11:45:45 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:09:55.047 11:45:45 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:09:55.047 11:45:45 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:09:55.047 11:45:45 -- common/autotest_common.sh@307 -- # [[ -z 377060 ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@307 -- # kill -0 377060 00:09:55.047 11:45:45 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:09:55.047 11:45:45 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 00:09:55.047 11:45:45 -- common/autotest_common.sh@320 -- # local mount target_dir 00:09:55.047 11:45:45 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:09:55.047 11:45:45 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:09:55.047 11:45:45 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:09:55.047 11:45:45 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:09:55.047 11:45:45 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.MfQBVN 00:09:55.047 11:45:45 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:55.047 11:45:45 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:09:55.047 11:45:45 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.MfQBVN/tests/vfio /tmp/spdk.MfQBVN 00:09:55.047 11:45:45 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:09:55.047 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.047 11:45:45 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:09:55.047 11:45:45 -- common/autotest_common.sh@316 -- # df -T 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:09:55.047 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:09:55.047 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:09:55.047 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:09:55.047 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:09:55.047 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=997285888 00:09:55.047 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:09:55.047 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=4287143936 00:09:55.047 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=spdk_root 00:09:55.047 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=86197030912 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=94508580864 00:09:55.048 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=8311549952 00:09:55.048 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=47251677184 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254290432 00:09:55.048 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=2613248 00:09:55.048 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=18895638528 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=18901716992 00:09:55.048 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=6078464 00:09:55.048 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=47253745664 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254290432 00:09:55.048 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=544768 00:09:55.048 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # avails["$mount"]=9450852352 00:09:55.048 11:45:45 -- common/autotest_common.sh@351 -- # sizes["$mount"]=9450856448 00:09:55.048 11:45:45 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:09:55.048 11:45:45 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:09:55.048 11:45:45 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:09:55.048 * Looking for test storage... 
00:09:55.048 11:45:45 -- common/autotest_common.sh@357 -- # local target_space new_size 00:09:55.048 11:45:45 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:09:55.048 11:45:45 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.048 11:45:45 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:55.048 11:45:45 -- common/autotest_common.sh@361 -- # mount=/ 00:09:55.048 11:45:45 -- common/autotest_common.sh@363 -- # target_space=86197030912 00:09:55.048 11:45:45 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:09:55.048 11:45:45 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:09:55.048 11:45:45 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:09:55.048 11:45:45 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:09:55.048 11:45:45 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:09:55.048 11:45:45 -- common/autotest_common.sh@370 -- # new_size=10526142464 00:09:55.048 11:45:45 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:55.048 11:45:45 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.048 11:45:45 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.048 11:45:45 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.048 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:09:55.048 11:45:45 -- common/autotest_common.sh@378 -- # return 0 00:09:55.048 11:45:45 -- common/autotest_common.sh@1668 -- # set -o errtrace 00:09:55.048 11:45:45 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:09:55.048 11:45:45 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:55.048 11:45:45 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:55.048 11:45:45 -- common/autotest_common.sh@1673 -- # true 00:09:55.048 11:45:45 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:09:55.048 11:45:45 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:09:55.048 11:45:45 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:09:55.048 11:45:45 -- common/autotest_common.sh@27 -- # exec 00:09:55.048 11:45:45 -- common/autotest_common.sh@29 -- # exec 00:09:55.048 11:45:45 -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:55.048 11:45:45 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:09:55.048 11:45:45 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:55.048 11:45:45 -- common/autotest_common.sh@18 -- # set -x 00:09:55.048 11:45:45 -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:09:55.048 11:45:45 -- ../common.sh@8 -- # pids=() 00:09:55.048 11:45:45 -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:55.048 11:45:45 -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:09:55.048 11:45:45 -- vfio/run.sh@68 -- # fuzz_num=7 00:09:55.048 11:45:45 -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:09:55.048 11:45:45 -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:09:55.048 11:45:45 -- vfio/run.sh@74 -- # mem_size=0 00:09:55.048 11:45:45 -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:09:55.048 11:45:45 -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:09:55.048 11:45:45 -- ../common.sh@69 -- # local fuzz_num=7 00:09:55.048 11:45:45 -- ../common.sh@70 -- # local time=1 00:09:55.048 11:45:45 -- ../common.sh@72 -- # (( i = 0 )) 00:09:55.048 11:45:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:55.048 11:45:45 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:09:55.048 11:45:45 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:09:55.048 11:45:45 -- vfio/run.sh@23 -- # local timen=1 00:09:55.048 11:45:45 -- vfio/run.sh@24 -- # local core=0x1 00:09:55.048 11:45:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:55.048 11:45:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:09:55.048 11:45:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:09:55.048 11:45:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:09:55.048 11:45:45 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:09:55.048 11:45:45 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:55.048 11:45:45 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:55.048 11:45:45 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:55.048 11:45:45 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:09:55.048 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:55.048 11:45:45 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:55.048 11:45:45 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:55.048 11:45:45 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:09:55.308 [2024-04-18 11:45:45.630930] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 
00:09:55.308 [2024-04-18 11:45:45.631014] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid377184 ] 00:09:55.308 EAL: No free 2048 kB hugepages reported on node 1 00:09:55.308 [2024-04-18 11:45:45.784101] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:55.567 [2024-04-18 11:45:45.954120] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:56.136 INFO: Running with entropic power schedule (0xFF, 100). 00:09:56.136 INFO: Seed: 1984666888 00:09:56.136 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:09:56.136 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:09:56.136 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:09:56.136 INFO: A corpus is not provided, starting from an empty corpus 00:09:56.136 #2 INITED exec/s: 0 rss: 210Mb 00:09:56.136 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:56.136 This may also happen if the target rejected all inputs we tried so far 00:09:56.136 [2024-04-18 11:45:46.447927] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:09:56.395 NEW_FUNC[1/634]: 0x548ce0 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:09:56.395 NEW_FUNC[2/634]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:09:56.654 #7 NEW cov: 10862 ft: 10780 corp: 2/19b lim: 60 exec/s: 0 rss: 223Mb L: 18/18 MS: 4 CrossOver-CopyPart-CrossOver-InsertRepeatedBytes- 00:09:57.223 #12 NEW cov: 10878 ft: 13794 corp: 3/68b lim: 60 exec/s: 12 rss: 224Mb L: 49/49 MS: 4 CopyPart-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:09:57.483 #14 NEW cov: 10878 ft: 16086 corp: 4/126b lim: 60 exec/s: 14 rss: 225Mb L: 58/58 MS: 1 CrossOver- 00:09:58.052 #16 NEW cov: 10878 ft: 16661 corp: 5/184b lim: 60 exec/s: 16 rss: 226Mb L: 58/58 MS: 1 InsertRepeatedBytes- 00:09:58.311 #18 NEW cov: 10878 ft: 17394 corp: 6/242b lim: 60 exec/s: 9 rss: 228Mb L: 58/58 MS: 1 CrossOver- 00:09:58.311 #18 DONE cov: 10878 ft: 17394 corp: 6/242b lim: 60 exec/s: 9 rss: 228Mb 00:09:58.311 Done 18 runs in 2 second(s) 00:09:58.311 [2024-04-18 11:45:48.792007] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:09:59.250 11:45:49 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:09:59.250 11:45:49 -- ../common.sh@72 -- # (( i++ )) 00:09:59.250 11:45:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:59.250 11:45:49 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:09:59.250 11:45:49 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:09:59.250 11:45:49 -- vfio/run.sh@23 -- # local timen=1 00:09:59.250 11:45:49 -- vfio/run.sh@24 -- # local core=0x1 00:09:59.250 11:45:49 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:59.250 11:45:49 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:09:59.250 11:45:49 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:09:59.250 11:45:49 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:09:59.250 11:45:49 -- vfio/run.sh@29 -- # local 
vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:09:59.250 11:45:49 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:59.250 11:45:49 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:59.250 11:45:49 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:09:59.250 11:45:49 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:09:59.250 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:59.250 11:45:49 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:59.250 11:45:49 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:59.250 11:45:49 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:09:59.509 [2024-04-18 11:45:49.844041] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:09:59.509 [2024-04-18 11:45:49.844153] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid377795 ] 00:09:59.509 EAL: No free 2048 kB hugepages reported on node 1 00:09:59.509 [2024-04-18 11:45:49.994825] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:59.768 [2024-04-18 11:45:50.174408] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.337 INFO: Running with entropic power schedule (0xFF, 100). 00:10:00.337 INFO: Seed: 1912791820 00:10:00.337 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:00.337 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:00.337 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:10:00.337 INFO: A corpus is not provided, starting from an empty corpus 00:10:00.337 #2 INITED exec/s: 0 rss: 211Mb 00:10:00.337 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:00.337 This may also happen if the target rejected all inputs we tried so far 00:10:00.337 [2024-04-18 11:45:50.660141] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:10:00.337 [2024-04-18 11:45:50.704505] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:00.337 [2024-04-18 11:45:50.704536] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:00.337 [2024-04-18 11:45:50.704567] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:00.596 NEW_FUNC[1/636]: 0x549460 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:10:00.596 NEW_FUNC[2/636]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:00.856 [2024-04-18 11:45:51.217807] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:00.856 [2024-04-18 11:45:51.217862] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:00.856 [2024-04-18 11:45:51.217888] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:00.856 #14 NEW cov: 10867 ft: 10830 corp: 2/20b lim: 40 exec/s: 0 rss: 223Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:10:01.115 [2024-04-18 11:45:51.429516] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:01.115 [2024-04-18 11:45:51.429560] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:01.115 [2024-04-18 11:45:51.429586] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:01.115 [2024-04-18 11:45:51.631034] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:01.115 [2024-04-18 11:45:51.631066] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:01.115 [2024-04-18 11:45:51.631088] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:01.374 #16 NEW cov: 10881 ft: 14672 corp: 3/37b lim: 40 exec/s: 16 rss: 224Mb L: 17/19 MS: 1 EraseBytes- 00:10:01.374 [2024-04-18 11:45:51.855241] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:01.374 [2024-04-18 11:45:51.855273] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:01.374 [2024-04-18 11:45:51.855298] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:01.632 [2024-04-18 11:45:52.043246] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:01.633 [2024-04-18 11:45:52.043276] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:01.633 [2024-04-18 11:45:52.043297] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:01.633 #18 NEW cov: 10881 ft: 15320 corp: 4/56b lim: 40 exec/s: 18 rss: 225Mb L: 19/19 MS: 1 ChangeByte- 00:10:01.892 [2024-04-18 11:45:52.263474] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:01.892 [2024-04-18 11:45:52.263508] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:01.892 [2024-04-18 11:45:52.263534] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:02.152 
[2024-04-18 11:45:52.454539] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:02.152 [2024-04-18 11:45:52.454573] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:02.152 [2024-04-18 11:45:52.454595] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:02.152 #20 NEW cov: 10881 ft: 15687 corp: 5/91b lim: 40 exec/s: 20 rss: 226Mb L: 35/35 MS: 1 CopyPart- 00:10:02.152 [2024-04-18 11:45:52.656488] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:02.152 [2024-04-18 11:45:52.656520] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:02.152 [2024-04-18 11:45:52.656544] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:02.411 [2024-04-18 11:45:52.849730] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:10:02.411 [2024-04-18 11:45:52.849761] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:10:02.411 [2024-04-18 11:45:52.849782] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:10:02.671 #22 NEW cov: 10881 ft: 15767 corp: 6/113b lim: 40 exec/s: 11 rss: 227Mb L: 22/35 MS: 1 InsertRepeatedBytes- 00:10:02.671 #22 DONE cov: 10881 ft: 15767 corp: 6/113b lim: 40 exec/s: 11 rss: 227Mb 00:10:02.671 Done 22 runs in 2 second(s) 00:10:02.671 [2024-04-18 11:45:52.992023] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:10:03.608 11:45:53 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:10:03.608 11:45:53 -- ../common.sh@72 -- # (( i++ )) 00:10:03.608 11:45:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:03.608 11:45:53 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:10:03.608 11:45:53 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:10:03.608 11:45:53 -- vfio/run.sh@23 -- # local timen=1 00:10:03.608 11:45:53 -- vfio/run.sh@24 -- # local core=0x1 00:10:03.608 11:45:53 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:03.608 11:45:53 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:10:03.608 11:45:53 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:10:03.608 11:45:53 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:10:03.608 11:45:53 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:10:03.608 11:45:53 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:10:03.608 11:45:53 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:10:03.608 11:45:53 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:03.608 11:45:53 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:10:03.608 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:03.608 11:45:53 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:10:03.608 11:45:53 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:10:03.608 11:45:54 -- vfio/run.sh@47 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:10:03.608 [2024-04-18 11:45:54.046812] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:10:03.609 [2024-04-18 11:45:54.046905] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid378337 ] 00:10:03.609 EAL: No free 2048 kB hugepages reported on node 1 00:10:03.868 [2024-04-18 11:45:54.198855] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.868 [2024-04-18 11:45:54.370285] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:04.436 INFO: Running with entropic power schedule (0xFF, 100). 00:10:04.436 INFO: Seed: 1817683515 00:10:04.436 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:04.436 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:04.436 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:10:04.436 INFO: A corpus is not provided, starting from an empty corpus 00:10:04.436 #2 INITED exec/s: 0 rss: 210Mb 00:10:04.436 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:04.436 This may also happen if the target rejected all inputs we tried so far 00:10:04.436 [2024-04-18 11:45:54.864662] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:10:04.436 [2024-04-18 11:45:54.892324] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:04.954 NEW_FUNC[1/635]: 0x54a090 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:10:04.954 NEW_FUNC[2/635]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:04.954 [2024-04-18 11:45:55.364725] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:04.954 #8 NEW cov: 10847 ft: 10801 corp: 2/10b lim: 80 exec/s: 0 rss: 223Mb L: 9/9 MS: 4 ChangeBit-CMP-ChangeBit-CMP- DE: "\034\000\000\000"-"\000\000\000\012"- 00:10:05.213 [2024-04-18 11:45:55.526046] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.213 [2024-04-18 11:45:55.662312] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.472 #10 NEW cov: 10861 ft: 13686 corp: 3/19b lim: 80 exec/s: 0 rss: 224Mb L: 9/9 MS: 1 ShuffleBytes- 00:10:05.472 [2024-04-18 11:45:55.819071] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.472 NEW_FUNC[1/1]: 0x169e950 in free_qp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:1885 00:10:05.472 [2024-04-18 11:45:55.965272] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.730 #12 NEW cov: 10872 ft: 14287 corp: 4/28b lim: 80 exec/s: 12 rss: 225Mb L: 9/9 MS: 1 ShuffleBytes- 
00:10:05.730 [2024-04-18 11:45:56.121290] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.730 [2024-04-18 11:45:56.257143] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.989 #19 NEW cov: 10872 ft: 14502 corp: 5/46b lim: 80 exec/s: 19 rss: 226Mb L: 18/18 MS: 1 CrossOver- 00:10:05.989 [2024-04-18 11:45:56.414658] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:05.989 NEW_FUNC[1/1]: 0x1d68c60 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:06.248 [2024-04-18 11:45:56.551356] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:06.248 #26 NEW cov: 10889 ft: 14954 corp: 6/64b lim: 80 exec/s: 26 rss: 228Mb L: 18/18 MS: 1 ChangeByte- 00:10:06.248 [2024-04-18 11:45:56.708922] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:06.507 [2024-04-18 11:45:56.845729] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:10:06.507 #28 NEW cov: 10892 ft: 15087 corp: 7/73b lim: 80 exec/s: 14 rss: 229Mb L: 9/18 MS: 1 ChangeBinInt- 00:10:06.507 #28 DONE cov: 10892 ft: 15087 corp: 7/73b lim: 80 exec/s: 14 rss: 229Mb 00:10:06.507 ###### Recommended dictionary. ###### 00:10:06.507 "\034\000\000\000" # Uses: 0 00:10:06.507 "\000\000\000\012" # Uses: 0 00:10:06.507 ###### End of recommended dictionary. ###### 00:10:06.507 Done 28 runs in 2 second(s) 00:10:06.507 [2024-04-18 11:45:56.970016] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:10:07.445 11:45:57 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:10:07.445 11:45:57 -- ../common.sh@72 -- # (( i++ )) 00:10:07.445 11:45:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:07.445 11:45:57 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:10:07.445 11:45:57 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:10:07.445 11:45:57 -- vfio/run.sh@23 -- # local timen=1 00:10:07.445 11:45:57 -- vfio/run.sh@24 -- # local core=0x1 00:10:07.445 11:45:57 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:07.445 11:45:57 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:10:07.445 11:45:57 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:10:07.445 11:45:57 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:10:07.445 11:45:57 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:10:07.445 11:45:57 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:10:07.445 11:45:57 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:10:07.445 11:45:57 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:07.445 11:45:57 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:10:07.445 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:07.445 11:45:57 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:10:07.445 11:45:57 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:10:07.445 
11:45:57 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:10:07.704 [2024-04-18 11:45:58.021617] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:10:07.704 [2024-04-18 11:45:58.021708] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid378880 ] 00:10:07.704 EAL: No free 2048 kB hugepages reported on node 1 00:10:07.704 [2024-04-18 11:45:58.171653] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.964 [2024-04-18 11:45:58.341917] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:08.223 INFO: Running with entropic power schedule (0xFF, 100). 00:10:08.223 INFO: Seed: 1490739666 00:10:08.481 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:08.481 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:08.481 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:10:08.481 INFO: A corpus is not provided, starting from an empty corpus 00:10:08.481 #2 INITED exec/s: 0 rss: 211Mb 00:10:08.481 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:08.482 This may also happen if the target rejected all inputs we tried so far 00:10:08.482 [2024-04-18 11:45:58.895123] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:10:08.999 NEW_FUNC[1/629]: 0x54a8e0 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:10:08.999 NEW_FUNC[2/629]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:09.258 #12 NEW cov: 10762 ft: 10811 corp: 2/38b lim: 320 exec/s: 0 rss: 224Mb L: 37/37 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:10:09.258 [2024-04-18 11:45:59.694315] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:09.258 [2024-04-18 11:45:59.694371] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:09.258 [2024-04-18 11:45:59.694387] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:09.258 [2024-04-18 11:45:59.694420] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:09.517 NEW_FUNC[1/7]: 0x16418a0 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3094 00:10:09.517 NEW_FUNC[2/7]: 0x2088b60 in accel_sequence_check_virtbuf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel.c:1504 00:10:09.517 [2024-04-18 11:45:59.917771] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:09.517 [2024-04-18 11:45:59.917807] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:09.517 [2024-04-18 11:45:59.917822] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:09.517 [2024-04-18 11:45:59.917843] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:09.517 #18 NEW cov: 10879 ft: 14594 corp: 3/156b lim: 320 exec/s: 18 rss: 224Mb L: 118/118 MS: 5 ShuffleBytes-ChangeByte-CrossOver-CMP-InsertRepeatedBytes- DE: "\201\000\000\000\000\000\000\000"- 00:10:09.776 [2024-04-18 11:46:00.130230] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.130269] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.130285] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.130313] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:09.776 [2024-04-18 11:46:00.312959] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.312989] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.313004] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:09.776 [2024-04-18 11:46:00.313025] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:10.035 #20 NEW cov: 10879 ft: 15176 corp: 4/246b lim: 320 exec/s: 20 rss: 225Mb L: 90/118 MS: 1 EraseBytes- 00:10:10.035 [2024-04-18 11:46:00.530302] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:10.035 [2024-04-18 11:46:00.530338] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:10.035 [2024-04-18 11:46:00.530352] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:10.035 [2024-04-18 11:46:00.530376] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:10.295 [2024-04-18 11:46:00.719517] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=330 offset=0 prot=0x3: Invalid argument 00:10:10.295 [2024-04-18 11:46:00.719546] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:10:10.295 [2024-04-18 11:46:00.719561] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:10:10.295 [2024-04-18 11:46:00.719580] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:10:10.555 #22 NEW cov: 10879 ft: 15760 corp: 5/337b lim: 320 exec/s: 11 rss: 227Mb L: 91/118 MS: 1 InsertByte- 00:10:10.555 #22 DONE cov: 10879 ft: 15760 corp: 5/337b lim: 320 exec/s: 11 rss: 227Mb 00:10:10.555 ###### Recommended dictionary. 
###### 00:10:10.555 "\201\000\000\000\000\000\000\000" # Uses: 0 00:10:10.555 ###### End of recommended dictionary. ###### 00:10:10.555 Done 22 runs in 2 second(s) 00:10:10.555 [2024-04-18 11:46:00.882978] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:10:11.494 11:46:01 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:10:11.494 11:46:01 -- ../common.sh@72 -- # (( i++ )) 00:10:11.494 11:46:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:11.494 11:46:01 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:10:11.494 11:46:01 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:10:11.494 11:46:01 -- vfio/run.sh@23 -- # local timen=1 00:10:11.494 11:46:01 -- vfio/run.sh@24 -- # local core=0x1 00:10:11.494 11:46:01 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:11.494 11:46:01 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:10:11.494 11:46:01 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:10:11.494 11:46:01 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:10:11.494 11:46:01 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:10:11.494 11:46:01 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:10:11.494 11:46:01 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:10:11.494 11:46:01 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:11.494 11:46:01 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:10:11.494 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:11.494 11:46:01 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:10:11.494 11:46:01 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:10:11.494 11:46:01 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:10:11.494 [2024-04-18 11:46:01.925086] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:10:11.494 [2024-04-18 11:46:01.925187] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid379376 ] 00:10:11.494 EAL: No free 2048 kB hugepages reported on node 1 00:10:11.754 [2024-04-18 11:46:02.067970] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:11.754 [2024-04-18 11:46:02.236215] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.324 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:12.324 INFO: Seed: 1098861683 00:10:12.324 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:12.324 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:12.324 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:10:12.324 INFO: A corpus is not provided, starting from an empty corpus 00:10:12.324 #2 INITED exec/s: 0 rss: 211Mb 00:10:12.324 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:12.324 This may also happen if the target rejected all inputs we tried so far 00:10:12.324 [2024-04-18 11:46:02.745562] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:10:12.842 NEW_FUNC[1/635]: 0x54b440 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:10:12.842 NEW_FUNC[2/635]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:13.101 #18 NEW cov: 10857 ft: 10803 corp: 2/120b lim: 320 exec/s: 0 rss: 223Mb L: 119/119 MS: 4 ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:10:13.361 #20 NEW cov: 10874 ft: 14201 corp: 3/239b lim: 320 exec/s: 20 rss: 224Mb L: 119/119 MS: 1 CrossOver- 00:10:13.929 #27 NEW cov: 10874 ft: 15704 corp: 4/292b lim: 320 exec/s: 27 rss: 225Mb L: 53/119 MS: 1 InsertRepeatedBytes- 00:10:14.500 #29 NEW cov: 10874 ft: 15967 corp: 5/412b lim: 320 exec/s: 14 rss: 226Mb L: 120/120 MS: 1 InsertByte- 00:10:14.500 #29 DONE cov: 10874 ft: 15967 corp: 5/412b lim: 320 exec/s: 14 rss: 226Mb 00:10:14.500 Done 29 runs in 2 second(s) 00:10:14.500 [2024-04-18 11:46:04.767969] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:10:15.440 11:46:05 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:10:15.440 11:46:05 -- ../common.sh@72 -- # (( i++ )) 00:10:15.440 11:46:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:15.440 11:46:05 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:10:15.440 11:46:05 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:10:15.440 11:46:05 -- vfio/run.sh@23 -- # local timen=1 00:10:15.440 11:46:05 -- vfio/run.sh@24 -- # local core=0x1 00:10:15.440 11:46:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:15.440 11:46:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:10:15.440 11:46:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:10:15.440 11:46:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:10:15.440 11:46:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:10:15.440 11:46:05 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:10:15.440 11:46:05 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:10:15.440 11:46:05 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:15.440 11:46:05 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:10:15.440 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:15.440 11:46:05 -- 
vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:10:15.440 11:46:05 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:10:15.440 11:46:05 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:10:15.441 [2024-04-18 11:46:05.817927] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:10:15.441 [2024-04-18 11:46:05.818014] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid379874 ] 00:10:15.441 EAL: No free 2048 kB hugepages reported on node 1 00:10:15.441 [2024-04-18 11:46:05.964276] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:15.701 [2024-04-18 11:46:06.128883] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:16.272 INFO: Running with entropic power schedule (0xFF, 100). 00:10:16.272 INFO: Seed: 680789615 00:10:16.272 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:16.272 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:16.272 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:10:16.272 INFO: A corpus is not provided, starting from an empty corpus 00:10:16.272 #2 INITED exec/s: 0 rss: 211Mb 00:10:16.272 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:10:16.272 This may also happen if the target rejected all inputs we tried so far 00:10:16.272 [2024-04-18 11:46:06.611444] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:10:16.272 [2024-04-18 11:46:06.657492] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:16.272 [2024-04-18 11:46:06.657535] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:16.843 NEW_FUNC[1/596]: 0x54c3b0 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:10:16.843 NEW_FUNC[2/596]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:16.843 [2024-04-18 11:46:07.172875] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:16.843 [2024-04-18 11:46:07.172939] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:16.843 #4 NEW cov: 9979 ft: 10809 corp: 2/83b lim: 120 exec/s: 0 rss: 223Mb L: 82/82 MS: 1 InsertRepeatedBytes- 00:10:16.843 [2024-04-18 11:46:07.388978] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:16.843 [2024-04-18 11:46:07.389029] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:17.103 NEW_FUNC[1/40]: 0x1e20c60 in spdk_bdev_get_num_blocks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/bdev/bdev.c:4699 00:10:17.103 NEW_FUNC[2/40]: 0x1e34910 in spdk_bdev_readv_blocks_ext /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/bdev/bdev.c:5432 00:10:17.103 [2024-04-18 11:46:07.600538] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:17.103 [2024-04-18 11:46:07.600583] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:17.396 #16 NEW cov: 10880 ft: 14390 corp: 3/165b lim: 120 exec/s: 16 rss: 225Mb L: 82/82 MS: 1 ChangeBinInt- 00:10:17.396 [2024-04-18 11:46:07.826687] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:17.396 [2024-04-18 11:46:07.826737] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:17.701 [2024-04-18 11:46:08.019854] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:17.701 [2024-04-18 11:46:08.019894] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:17.701 #18 NEW cov: 10883 ft: 14810 corp: 4/248b lim: 120 exec/s: 18 rss: 226Mb L: 83/83 MS: 1 InsertByte- 00:10:17.963 [2024-04-18 11:46:08.245412] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:17.963 [2024-04-18 11:46:08.245474] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:17.963 [2024-04-18 11:46:08.440777] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:17.963 [2024-04-18 11:46:08.440815] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:18.223 #20 NEW cov: 10883 ft: 15143 corp: 5/330b lim: 120 exec/s: 10 rss: 227Mb L: 82/83 MS: 1 ShuffleBytes- 00:10:18.223 #20 DONE cov: 10883 ft: 15143 corp: 5/330b lim: 120 exec/s: 10 rss: 227Mb 00:10:18.223 Done 20 runs in 2 second(s) 00:10:18.223 [2024-04-18 11:46:08.600993] vfio_user.c:2798:disable_ctrlr: *NOTICE*: 
/tmp/vfio-user-5/domain/2: disabling controller 00:10:19.162 11:46:09 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:10:19.162 11:46:09 -- ../common.sh@72 -- # (( i++ )) 00:10:19.162 11:46:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:19.162 11:46:09 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:10:19.162 11:46:09 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:10:19.162 11:46:09 -- vfio/run.sh@23 -- # local timen=1 00:10:19.162 11:46:09 -- vfio/run.sh@24 -- # local core=0x1 00:10:19.162 11:46:09 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:19.162 11:46:09 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:10:19.162 11:46:09 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:10:19.162 11:46:09 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:10:19.162 11:46:09 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:10:19.162 11:46:09 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:10:19.162 11:46:09 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:10:19.162 11:46:09 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:19.162 11:46:09 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:10:19.162 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:10:19.162 11:46:09 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:10:19.162 11:46:09 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:10:19.162 11:46:09 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:10:19.162 [2024-04-18 11:46:09.653107] Starting SPDK v24.05-pre git sha1 65b4e17c6 / DPDK 23.11.0 initialization... 00:10:19.162 [2024-04-18 11:46:09.653193] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid380334 ] 00:10:19.421 EAL: No free 2048 kB hugepages reported on node 1 00:10:19.421 [2024-04-18 11:46:09.805794] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:19.681 [2024-04-18 11:46:09.981716] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:19.940 INFO: Running with entropic power schedule (0xFF, 100). 
00:10:19.940 INFO: Seed: 242819327 00:10:19.940 INFO: Loaded 1 modules (348745 inline 8-bit counters): 348745 [0x342970c, 0x347e955), 00:10:19.940 INFO: Loaded 1 PC tables (348745 PCs): 348745 [0x347e958,0x39d0de8), 00:10:19.940 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:10:19.940 INFO: A corpus is not provided, starting from an empty corpus 00:10:19.940 #2 INITED exec/s: 0 rss: 210Mb 00:10:19.940 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:10:19.940 This may also happen if the target rejected all inputs we tried so far 00:10:19.940 [2024-04-18 11:46:10.465367] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:10:20.199 [2024-04-18 11:46:10.502487] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:20.199 [2024-04-18 11:46:10.502559] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:20.458 NEW_FUNC[1/636]: 0x54d2c0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:189 00:10:20.458 NEW_FUNC[2/636]: 0x5502d0 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:219 00:10:20.718 [2024-04-18 11:46:11.012102] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:20.718 [2024-04-18 11:46:11.012167] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:20.718 #6 NEW cov: 10858 ft: 10857 corp: 2/12b lim: 90 exec/s: 0 rss: 223Mb L: 11/11 MS: 3 InsertByte-InsertByte-CMP- DE: "\221\273\276\243\007\376\004\000"- 00:10:20.718 [2024-04-18 11:46:11.229709] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:20.718 [2024-04-18 11:46:11.229757] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:20.977 [2024-04-18 11:46:11.417679] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:20.977 [2024-04-18 11:46:11.417719] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:21.236 #13 NEW cov: 10872 ft: 14813 corp: 3/23b lim: 90 exec/s: 13 rss: 224Mb L: 11/11 MS: 1 ChangeByte- 00:10:21.237 [2024-04-18 11:46:11.621283] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:21.237 [2024-04-18 11:46:11.621324] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:21.496 [2024-04-18 11:46:11.807609] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:21.496 [2024-04-18 11:46:11.807647] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:21.496 #20 NEW cov: 10875 ft: 15976 corp: 4/32b lim: 90 exec/s: 20 rss: 225Mb L: 9/11 MS: 1 PersAutoDict- DE: "\221\273\276\243\007\376\004\000"- 00:10:21.496 [2024-04-18 11:46:12.018755] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:21.496 [2024-04-18 11:46:12.018797] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:21.755 NEW_FUNC[1/1]: 0x1d68c60 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:10:21.755 [2024-04-18 11:46:12.195002] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 
failed: Invalid argument 00:10:21.755 [2024-04-18 11:46:12.195040] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:22.014 #22 NEW cov: 10892 ft: 16327 corp: 5/41b lim: 90 exec/s: 22 rss: 226Mb L: 9/11 MS: 1 ChangeByte- 00:10:22.014 [2024-04-18 11:46:12.402231] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:22.014 [2024-04-18 11:46:12.402274] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:22.273 [2024-04-18 11:46:12.580389] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:10:22.273 [2024-04-18 11:46:12.580460] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:10:22.273 #24 NEW cov: 10892 ft: 17075 corp: 6/50b lim: 90 exec/s: 12 rss: 228Mb L: 9/11 MS: 1 CMP- DE: "\357\001"- 00:10:22.273 #24 DONE cov: 10892 ft: 17075 corp: 6/50b lim: 90 exec/s: 12 rss: 228Mb 00:10:22.273 ###### Recommended dictionary. ###### 00:10:22.273 "\221\273\276\243\007\376\004\000" # Uses: 1 00:10:22.273 "\357\001" # Uses: 0 00:10:22.273 ###### End of recommended dictionary. ###### 00:10:22.273 Done 24 runs in 2 second(s) 00:10:22.273 [2024-04-18 11:46:12.737994] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:10:23.210 11:46:13 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:10:23.210 11:46:13 -- ../common.sh@72 -- # (( i++ )) 00:10:23.210 11:46:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:10:23.210 11:46:13 -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:10:23.210 00:10:23.210 real 0m28.418s 00:10:23.210 user 0m33.958s 00:10:23.210 sys 0m3.134s 00:10:23.210 11:46:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:10:23.210 11:46:13 -- common/autotest_common.sh@10 -- # set +x 00:10:23.210 ************************************ 00:10:23.210 END TEST vfio_fuzz 00:10:23.210 ************************************ 00:10:23.210 11:46:13 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:10:23.210 00:10:23.210 real 1m51.125s 00:10:23.210 user 2m16.422s 00:10:23.210 sys 0m15.336s 00:10:23.210 11:46:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:10:23.210 11:46:13 -- common/autotest_common.sh@10 -- # set +x 00:10:23.210 ************************************ 00:10:23.210 END TEST llvm_fuzz 00:10:23.210 ************************************ 00:10:23.469 11:46:13 -- spdk/autotest.sh@373 -- # [[ 0 -eq 1 ]] 00:10:23.469 11:46:13 -- spdk/autotest.sh@378 -- # trap - SIGINT SIGTERM EXIT 00:10:23.469 11:46:13 -- spdk/autotest.sh@380 -- # timing_enter post_cleanup 00:10:23.469 11:46:13 -- common/autotest_common.sh@710 -- # xtrace_disable 00:10:23.469 11:46:13 -- common/autotest_common.sh@10 -- # set +x 00:10:23.469 11:46:13 -- spdk/autotest.sh@381 -- # autotest_cleanup 00:10:23.469 11:46:13 -- common/autotest_common.sh@1378 -- # local autotest_es=0 00:10:23.469 11:46:13 -- common/autotest_common.sh@1379 -- # xtrace_disable 00:10:23.469 11:46:13 -- common/autotest_common.sh@10 -- # set +x 00:10:27.660 INFO: APP EXITING 00:10:27.660 INFO: killing all VMs 00:10:27.660 INFO: killing vhost app 00:10:27.660 INFO: EXIT DONE 00:10:31.855 Waiting for block devices as requested 00:10:31.855 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:10:31.855 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:31.855 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:31.855 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:31.855 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 
00:10:32.113 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:32.113 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:32.113 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:32.113 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:32.372 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:10:32.372 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:10:32.372 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:10:32.631 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:10:32.631 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:10:32.631 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:10:32.889 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:10:32.889 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:10:38.189 Cleaning 00:10:38.189 Removing: /dev/shm/spdk_tgt_trace.pid339598 00:10:38.189 Removing: /var/run/dpdk/spdk_pid336326 00:10:38.189 Removing: /var/run/dpdk/spdk_pid337720 00:10:38.189 Removing: /var/run/dpdk/spdk_pid339598 00:10:38.189 Removing: /var/run/dpdk/spdk_pid340484 00:10:38.190 Removing: /var/run/dpdk/spdk_pid341395 00:10:38.451 Removing: /var/run/dpdk/spdk_pid341758 00:10:38.451 Removing: /var/run/dpdk/spdk_pid342710 00:10:38.451 Removing: /var/run/dpdk/spdk_pid342887 00:10:38.451 Removing: /var/run/dpdk/spdk_pid343412 00:10:38.451 Removing: /var/run/dpdk/spdk_pid344006 00:10:38.451 Removing: /var/run/dpdk/spdk_pid344534 00:10:38.451 Removing: /var/run/dpdk/spdk_pid345363 00:10:38.451 Removing: /var/run/dpdk/spdk_pid345954 00:10:38.451 Removing: /var/run/dpdk/spdk_pid346165 00:10:38.451 Removing: /var/run/dpdk/spdk_pid346377 00:10:38.451 Removing: /var/run/dpdk/spdk_pid346771 00:10:38.451 Removing: /var/run/dpdk/spdk_pid347544 00:10:38.451 Removing: /var/run/dpdk/spdk_pid350067 00:10:38.451 Removing: /var/run/dpdk/spdk_pid350629 00:10:38.451 Removing: /var/run/dpdk/spdk_pid351025 00:10:38.451 Removing: /var/run/dpdk/spdk_pid351197 00:10:38.451 Removing: /var/run/dpdk/spdk_pid352222 00:10:38.451 Removing: /var/run/dpdk/spdk_pid352305 00:10:38.451 Removing: /var/run/dpdk/spdk_pid353399 00:10:38.451 Removing: /var/run/dpdk/spdk_pid353530 00:10:38.451 Removing: /var/run/dpdk/spdk_pid353961 00:10:38.451 Removing: /var/run/dpdk/spdk_pid353986 00:10:38.451 Removing: /var/run/dpdk/spdk_pid354369 00:10:38.451 Removing: /var/run/dpdk/spdk_pid354547 00:10:38.451 Removing: /var/run/dpdk/spdk_pid355382 00:10:38.451 Removing: /var/run/dpdk/spdk_pid355746 00:10:38.451 Removing: /var/run/dpdk/spdk_pid355949 00:10:38.451 Removing: /var/run/dpdk/spdk_pid356197 00:10:38.451 Removing: /var/run/dpdk/spdk_pid356605 00:10:38.451 Removing: /var/run/dpdk/spdk_pid356805 00:10:38.451 Removing: /var/run/dpdk/spdk_pid357175 00:10:38.451 Removing: /var/run/dpdk/spdk_pid357444 00:10:38.451 Removing: /var/run/dpdk/spdk_pid357813 00:10:38.451 Removing: /var/run/dpdk/spdk_pid358144 00:10:38.451 Removing: /var/run/dpdk/spdk_pid358401 00:10:38.451 Removing: /var/run/dpdk/spdk_pid358772 00:10:38.451 Removing: /var/run/dpdk/spdk_pid359144 00:10:38.451 Removing: /var/run/dpdk/spdk_pid359365 00:10:38.451 Removing: /var/run/dpdk/spdk_pid359733 00:10:38.451 Removing: /var/run/dpdk/spdk_pid360107 00:10:38.451 Removing: /var/run/dpdk/spdk_pid360319 00:10:38.451 Removing: /var/run/dpdk/spdk_pid360691 00:10:38.451 Removing: /var/run/dpdk/spdk_pid361067 00:10:38.451 Removing: /var/run/dpdk/spdk_pid361289 00:10:38.451 Removing: /var/run/dpdk/spdk_pid361651 00:10:38.451 Removing: /var/run/dpdk/spdk_pid362029 00:10:38.451 Removing: /var/run/dpdk/spdk_pid362399 00:10:38.451 Removing: /var/run/dpdk/spdk_pid362628 
00:10:38.451 Removing: /var/run/dpdk/spdk_pid362988
00:10:38.451 Removing: /var/run/dpdk/spdk_pid363365
00:10:38.451 Removing: /var/run/dpdk/spdk_pid363709
00:10:38.451 Removing: /var/run/dpdk/spdk_pid363987
00:10:38.451 Removing: /var/run/dpdk/spdk_pid364527
00:10:38.451 Removing: /var/run/dpdk/spdk_pid365320
00:10:38.451 Removing: /var/run/dpdk/spdk_pid365723
00:10:38.451 Removing: /var/run/dpdk/spdk_pid366157
00:10:38.451 Removing: /var/run/dpdk/spdk_pid366651
00:10:38.451 Removing: /var/run/dpdk/spdk_pid367034
00:10:38.451 Removing: /var/run/dpdk/spdk_pid367571
00:10:38.451 Removing: /var/run/dpdk/spdk_pid367955
00:10:38.711 Removing: /var/run/dpdk/spdk_pid368449
00:10:38.711 Removing: /var/run/dpdk/spdk_pid368882
00:10:38.711 Removing: /var/run/dpdk/spdk_pid369305
00:10:38.711 Removing: /var/run/dpdk/spdk_pid369800
00:10:38.711 Removing: /var/run/dpdk/spdk_pid370187
00:10:38.711 Removing: /var/run/dpdk/spdk_pid370848
00:10:38.711 Removing: /var/run/dpdk/spdk_pid371621
00:10:38.711 Removing: /var/run/dpdk/spdk_pid372141
00:10:38.711 Removing: /var/run/dpdk/spdk_pid372553
00:10:38.711 Removing: /var/run/dpdk/spdk_pid372988
00:10:38.711 Removing: /var/run/dpdk/spdk_pid373477
00:10:38.711 Removing: /var/run/dpdk/spdk_pid373859
00:10:38.711 Removing: /var/run/dpdk/spdk_pid374399
00:10:38.711 Removing: /var/run/dpdk/spdk_pid374787
00:10:38.711 Removing: /var/run/dpdk/spdk_pid375275
00:10:38.711 Removing: /var/run/dpdk/spdk_pid375711
00:10:38.711 Removing: /var/run/dpdk/spdk_pid376094
00:10:38.711 Removing: /var/run/dpdk/spdk_pid376628
00:10:38.711 Removing: /var/run/dpdk/spdk_pid377184
00:10:38.711 Removing: /var/run/dpdk/spdk_pid377795
00:10:38.711 Removing: /var/run/dpdk/spdk_pid378337
00:10:38.711 Removing: /var/run/dpdk/spdk_pid378880
00:10:38.711 Removing: /var/run/dpdk/spdk_pid379376
00:10:38.711 Removing: /var/run/dpdk/spdk_pid379874
00:10:38.711 Removing: /var/run/dpdk/spdk_pid380334
00:10:38.711 Clean
00:10:38.971 11:46:29 -- common/autotest_common.sh@1437 -- # return 0
00:10:38.971 11:46:29 -- spdk/autotest.sh@382 -- # timing_exit post_cleanup
00:10:38.971 11:46:29 -- common/autotest_common.sh@716 -- # xtrace_disable
00:10:38.971 11:46:29 -- common/autotest_common.sh@10 -- # set +x
00:10:38.971 11:46:29 -- spdk/autotest.sh@384 -- # timing_exit autotest
00:10:38.971 11:46:29 -- common/autotest_common.sh@716 -- # xtrace_disable
00:10:38.971 11:46:29 -- common/autotest_common.sh@10 -- # set +x
00:10:38.971 11:46:29 -- spdk/autotest.sh@385 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:38.971 11:46:29 -- spdk/autotest.sh@387 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:10:38.971 11:46:29 -- spdk/autotest.sh@387 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:10:38.971 11:46:29 -- spdk/autotest.sh@389 -- # hash lcov
00:10:38.971 11:46:29 -- spdk/autotest.sh@389 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:10:38.971 11:46:29 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:10:38.971 11:46:29 -- scripts/common.sh@502 -- $ [[ -e /bin/wpdk_common.sh ]]
00:10:38.971 11:46:29 -- scripts/common.sh@510 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:10:38.971 11:46:29 -- scripts/common.sh@511 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:10:38.971 11:46:29 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:38.971 11:46:29 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:38.971 11:46:29 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:38.971 11:46:29 -- paths/export.sh@5 -- $ export PATH
00:10:38.971 11:46:29 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:10:38.971 11:46:29 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:10:38.971 11:46:29 -- common/autobuild_common.sh@435 -- $ date +%s
00:10:39.230 11:46:29 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713433589.XXXXXX
00:10:39.230 11:46:29 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713433589.7XDaG7
00:10:39.230 11:46:29 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:10:39.230 11:46:29 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:10:39.230 11:46:29 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:10:39.230 11:46:29 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:10:39.230 11:46:29 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:10:39.230 11:46:29 -- common/autobuild_common.sh@451 -- $ get_config_params
00:10:39.230 11:46:29 -- common/autotest_common.sh@385 -- $ xtrace_disable
00:10:39.230 11:46:29 -- common/autotest_common.sh@10 -- $ set +x
00:10:39.230 11:46:29 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-vfio-user'
00:10:39.230 11:46:29 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
00:10:39.230 11:46:29 -- pm/common@17 -- $ local monitor
00:10:39.230 11:46:29 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:39.230 11:46:29 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=386664
00:10:39.230 11:46:29 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:39.230 11:46:29 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=386665
00:10:39.230 11:46:29 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:39.230 11:46:29 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=386667
00:10:39.230 11:46:29 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:39.230 11:46:29 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=386670
00:10:39.230 11:46:29 -- pm/common@26 -- $ sleep 1
00:10:39.230 11:46:29 -- pm/common@21 -- $ date +%s
00:10:39.230 11:46:29 -- pm/common@21 -- $ date +%s
00:10:39.231 11:46:29 -- pm/common@21 -- $ date +%s
00:10:39.231 11:46:29 -- pm/common@21 -- $ date +%s
00:10:39.231 11:46:29 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713433589
00:10:39.231 11:46:29 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713433589
00:10:39.231 11:46:29 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713433589
00:10:39.231 11:46:29 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713433589
00:10:39.231 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713433589_collect-vmstat.pm.log
00:10:39.231 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713433589_collect-cpu-load.pm.log
00:10:39.231 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713433589_collect-bmc-pm.bmc.pm.log
00:10:39.231 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713433589_collect-cpu-temp.pm.log
00:10:40.169 11:46:30 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
00:10:40.169 11:46:30 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:10:40.169 11:46:30 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:40.169 11:46:30 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:10:40.169 11:46:30 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:10:40.169 11:46:30 -- spdk/autopackage.sh@19 -- $ timing_finish
00:10:40.169 11:46:30 -- common/autotest_common.sh@722 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:10:40.169 11:46:30 -- common/autotest_common.sh@723 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:10:40.169 11:46:30 -- common/autotest_common.sh@725 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:10:40.169 11:46:30 -- spdk/autopackage.sh@20 -- $ exit 0
00:10:40.169 11:46:30 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:10:40.169 11:46:30 -- pm/common@30 -- $ signal_monitor_resources TERM
00:10:40.169 11:46:30 -- pm/common@41 -- $ local monitor pid pids signal=TERM
00:10:40.169 11:46:30 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:40.169 11:46:30 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:10:40.169 11:46:30 -- pm/common@45 -- $ pid=386698
00:10:40.169 11:46:30 -- pm/common@52 -- $ sudo kill -TERM 386698
00:10:40.169 11:46:30 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:40.169 11:46:30 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:10:40.169 11:46:30 -- pm/common@45 -- $ pid=386695
00:10:40.169 11:46:30 -- pm/common@52 -- $ sudo kill -TERM 386695
00:10:40.169 11:46:30 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:40.169 11:46:30 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:10:40.169 11:46:30 -- pm/common@45 -- $ pid=386703
00:10:40.169 11:46:30 -- pm/common@52 -- $ sudo kill -TERM 386703
00:10:40.169 11:46:30 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:10:40.169 11:46:30 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:10:40.169 11:46:30 -- pm/common@45 -- $ pid=386706
00:10:40.169 11:46:30 -- pm/common@52 -- $ sudo kill -TERM 386706
00:10:40.428 + [[ -n 228835 ]]
00:10:40.428 + sudo kill 228835
00:10:40.438 [Pipeline] }
00:10:40.459 [Pipeline] // stage
00:10:40.464 [Pipeline] }
00:10:40.483 [Pipeline] // timeout
00:10:40.490 [Pipeline] }
00:10:40.507 [Pipeline] // catchError
00:10:40.513 [Pipeline] }
00:10:40.532 [Pipeline] // wrap
00:10:40.538 [Pipeline] }
00:10:40.553 [Pipeline] // catchError
00:10:40.560 [Pipeline] stage
00:10:40.562 [Pipeline] { (Epilogue)
00:10:40.576 [Pipeline] catchError
00:10:40.578 [Pipeline] {
00:10:40.591 [Pipeline] echo
00:10:40.593 Cleanup processes
00:10:40.599 [Pipeline] sh
00:10:40.884 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:40.884 286186 sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081
00:10:40.884 286218 bash /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713433081
00:10:40.884 386851 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache
00:10:40.884 387633 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:40.899 [Pipeline] sh
00:10:41.183 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:10:41.183 ++ grep -v 'sudo pgrep'
00:10:41.183 ++ awk '{print $1}'
00:10:41.183 + sudo kill -9 286186 286218 386851
00:10:41.195 [Pipeline] sh
00:10:41.480 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:10:42.925 [Pipeline] sh
00:10:43.211 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:43.211 Artifacts sizes are good
00:10:43.226 [Pipeline] archiveArtifacts
00:10:43.233 Archiving artifacts
00:10:43.312 [Pipeline] sh
00:10:43.595 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:43.609 [Pipeline] cleanWs
00:10:43.619 [WS-CLEANUP] Deleting project workspace...
00:10:43.619 [WS-CLEANUP] Deferred wipeout is used...
00:10:43.626 [WS-CLEANUP] done
00:10:43.628 [Pipeline] }
00:10:43.648 [Pipeline] // catchError
00:10:43.660 [Pipeline] sh
00:10:43.944 + logger -p user.info -t JENKINS-CI
00:10:43.953 [Pipeline] }
00:10:43.970 [Pipeline] // stage
00:10:43.976 [Pipeline] }
00:10:43.994 [Pipeline] // node
00:10:44.001 [Pipeline] End of Pipeline
00:10:44.040 Finished: SUCCESS