00:00:00.000 Started by upstream project "autotest-per-patch" build number 121010
00:00:00.000 originally caused by:
00:00:00.000 Started by user sys_sgci
00:00:00.032 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.033 The recommended git tool is: git
00:00:00.033 using credential 00000000-0000-0000-0000-000000000002
00:00:00.037 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.062 Fetching changes from the remote Git repository
00:00:00.064 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.098 Using shallow fetch with depth 1
00:00:00.098 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.098 > git --version # timeout=10
00:00:00.145 > git --version # 'git version 2.39.2'
00:00:00.145 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.146 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.146 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:03.644 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:03.655 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:03.671 Checking out Revision 6e1fadd1eee50389429f9abb33dde5face8ca717 (FETCH_HEAD)
00:00:03.671 > git config core.sparsecheckout # timeout=10
00:00:03.684 > git read-tree -mu HEAD # timeout=10
00:00:03.700 > git checkout -f 6e1fadd1eee50389429f9abb33dde5face8ca717 # timeout=5
00:00:03.720 Commit message: "pool: attach build logs for failed merge builds"
00:00:03.720 > git rev-list --no-walk 6e1fadd1eee50389429f9abb33dde5face8ca717 # timeout=10
00:00:03.827 [Pipeline] Start of Pipeline
00:00:03.839 [Pipeline] library
00:00:03.840 Loading library shm_lib@master
00:00:03.840 Library shm_lib@master is cached. Copying from home.
00:00:03.852 [Pipeline] node
00:00:03.861 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:03.863 [Pipeline] {
00:00:03.873 [Pipeline] catchError
00:00:03.875 [Pipeline] {
00:00:03.885 [Pipeline] wrap
00:00:03.892 [Pipeline] {
00:00:03.898 [Pipeline] stage
00:00:03.900 [Pipeline] { (Prologue)
00:00:04.102 [Pipeline] sh
00:00:04.380 + logger -p user.info -t JENKINS-CI
00:00:04.402 [Pipeline] echo
00:00:04.404 Node: WFP39
00:00:04.414 [Pipeline] sh
00:00:04.717 [Pipeline] setCustomBuildProperty
00:00:04.730 [Pipeline] echo
00:00:04.731 Cleanup processes
00:00:04.736 [Pipeline] sh
00:00:05.017 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.017 1494730 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.029 [Pipeline] sh
00:00:05.309 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.309 ++ grep -v 'sudo pgrep'
00:00:05.309 ++ awk '{print $1}'
00:00:05.309 + sudo kill -9
00:00:05.309 + true
00:00:05.321 [Pipeline] cleanWs
00:00:05.329 [WS-CLEANUP] Deleting project workspace...
00:00:05.329 [WS-CLEANUP] Deferred wipeout is used...
00:00:05.335 [WS-CLEANUP] done
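The cleanup step above collects PIDs with pgrep, drops the pgrep invocation itself with grep -v, and takes the first column with awk; here the list came back empty, so `sudo kill -9` ran with no arguments, failed, and fell through to `+ true`, which keeps the step from failing the build. A minimal sketch of the same idiom ($WORKSPACE and the variable name are hypothetical, not from the job):

    # collect PIDs of stale processes still running out of the workspace,
    # excluding the pgrep we just launched to find them
    stale_pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # kill -9 with an empty argument list exits non-zero; '|| true'
    # keeps the cleanup step green when nothing matched
    sudo kill -9 $stale_pids || true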
00:00:05.339 [Pipeline] setCustomBuildProperty
00:00:05.352 [Pipeline] sh
00:00:05.632 + sudo git config --global --replace-all safe.directory '*'
00:00:05.713 [Pipeline] nodesByLabel
00:00:05.715 Found a total of 1 nodes with the 'sorcerer' label
00:00:05.727 [Pipeline] httpRequest
00:00:05.731 HttpMethod: GET
00:00:05.731 URL: http://10.211.164.96/packages/jbp_6e1fadd1eee50389429f9abb33dde5face8ca717.tar.gz
00:00:05.734 Sending request to url: http://10.211.164.96/packages/jbp_6e1fadd1eee50389429f9abb33dde5face8ca717.tar.gz
00:00:05.736 Response Code: HTTP/1.1 200 OK
00:00:05.738 Success: Status code 200 is in the accepted range: 200,404
00:00:05.738 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_6e1fadd1eee50389429f9abb33dde5face8ca717.tar.gz
00:00:06.330 [Pipeline] sh
00:00:06.610 + tar --no-same-owner -xf jbp_6e1fadd1eee50389429f9abb33dde5face8ca717.tar.gz
00:00:06.623 [Pipeline] httpRequest
00:00:06.626 HttpMethod: GET
00:00:06.626 URL: http://10.211.164.96/packages/spdk_5c8d451f1a33f46d3d4261563c310dc4efcca339.tar.gz
00:00:06.627 Sending request to url: http://10.211.164.96/packages/spdk_5c8d451f1a33f46d3d4261563c310dc4efcca339.tar.gz
00:00:06.630 Response Code: HTTP/1.1 200 OK
00:00:06.630 Success: Status code 200 is in the accepted range: 200,404
00:00:06.630 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_5c8d451f1a33f46d3d4261563c310dc4efcca339.tar.gz
00:00:26.663 [Pipeline] sh
00:00:26.943 + tar --no-same-owner -xf spdk_5c8d451f1a33f46d3d4261563c310dc4efcca339.tar.gz
00:00:29.488 [Pipeline] sh
00:00:29.771 + git -C spdk log --oneline -n5
00:00:29.771 5c8d451f1 app/trace: emit owner descriptions
00:00:29.771 aaaef7578 trace: rename trace_event's poller_id to owner_id
00:00:29.771 98cccbebd trace: add concept of "owner" to trace files
00:00:29.771 bf2cbb6d8 trace: rename "per_lcore_history" to just "data"
00:00:29.771 035bc63a4 trace: add trace_flags_fini()
00:00:29.785 [Pipeline] }
00:00:29.804 [Pipeline] // stage
00:00:29.813 [Pipeline] stage
00:00:29.815 [Pipeline] { (Prepare)
00:00:29.833 [Pipeline] writeFile
00:00:29.850 [Pipeline] sh
00:00:30.166 + logger -p user.info -t JENKINS-CI
00:00:30.178 [Pipeline] sh
00:00:30.457 + logger -p user.info -t JENKINS-CI
00:00:30.470 [Pipeline] sh
00:00:30.750 + cat autorun-spdk.conf
00:00:30.750 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:30.750 SPDK_TEST_FUZZER_SHORT=1
00:00:30.750 SPDK_TEST_FUZZER=1
00:00:30.750 SPDK_RUN_UBSAN=1
00:00:30.757 RUN_NIGHTLY=0
00:00:30.762 [Pipeline] readFile
00:00:30.787 [Pipeline] withEnv
00:00:30.789 [Pipeline] {
00:00:30.805 [Pipeline] sh
00:00:31.086 + set -ex
00:00:31.086 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:00:31.086 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:00:31.086 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:31.086 ++ SPDK_TEST_FUZZER_SHORT=1
00:00:31.086 ++ SPDK_TEST_FUZZER=1
00:00:31.086 ++ SPDK_RUN_UBSAN=1
00:00:31.086 ++ RUN_NIGHTLY=0
00:00:31.086 + case $SPDK_TEST_NVMF_NICS in
00:00:31.086 + DRIVERS=
00:00:31.086 + [[ -n '' ]]
00:00:31.086 + exit 0
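The job writes the test knobs into autorun-spdk.conf once (the writeFile/cat pair above) and then re-sources the same file under `set -ex` in later steps, here only to decide whether extra NIC drivers are needed; SPDK_TEST_NVMF_NICS is unset, so DRIVERS stays empty and the step exits 0. A sketch of that write-then-source pattern ($WORKSPACE is a stand-in for the job workspace):

    # write the configuration once at the start of the job...
    {
        echo 'SPDK_RUN_FUNCTIONAL_TEST=1'
        echo 'SPDK_TEST_FUZZER_SHORT=1'
        echo 'SPDK_TEST_FUZZER=1'
        echo 'SPDK_RUN_UBSAN=1'
        echo 'RUN_NIGHTLY=0'
    } > "$WORKSPACE/autorun-spdk.conf"
    # ...and have every later stage source it, so all stages agree on
    # one consistent test configuration
    set -ex
    [[ -f $WORKSPACE/autorun-spdk.conf ]] && source "$WORKSPACE/autorun-spdk.conf"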
00:00:31.094 [Pipeline] }
00:00:31.113 [Pipeline] // withEnv
00:00:31.118 [Pipeline] }
00:00:31.133 [Pipeline] // stage
00:00:31.142 [Pipeline] catchError
00:00:31.143 [Pipeline] {
00:00:31.157 [Pipeline] timeout
00:00:31.157 Timeout set to expire in 30 min
00:00:31.159 [Pipeline] {
00:00:31.174 [Pipeline] stage
00:00:31.176 [Pipeline] { (Tests)
00:00:31.192 [Pipeline] sh
00:00:31.472 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:31.472 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:31.472 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:00:31.472 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:00:31.472 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:31.472 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:31.472 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:00:31.472 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:31.472 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:00:31.472 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:00:31.472 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:31.472 + source /etc/os-release
00:00:31.472 ++ NAME='Fedora Linux'
00:00:31.472 ++ VERSION='38 (Cloud Edition)'
00:00:31.472 ++ ID=fedora
00:00:31.472 ++ VERSION_ID=38
00:00:31.472 ++ VERSION_CODENAME=
00:00:31.472 ++ PLATFORM_ID=platform:f38
00:00:31.472 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:31.472 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:31.472 ++ LOGO=fedora-logo-icon
00:00:31.472 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:31.472 ++ HOME_URL=https://fedoraproject.org/
00:00:31.472 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:31.472 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:31.472 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:31.472 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:31.472 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:31.472 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:31.472 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:31.472 ++ SUPPORT_END=2024-05-14
00:00:31.472 ++ VARIANT='Cloud Edition'
00:00:31.472 ++ VARIANT_ID=cloud
00:00:31.472 + uname -a
00:00:31.472 Linux spdk-wfp-39 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:00:31.472 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:00:34.760 Hugepages
00:00:34.760 node hugesize free / total
00:00:34.760 node0 1048576kB 0 / 0
00:00:34.760 node0 2048kB 0 / 0
00:00:34.760 node1 1048576kB 0 / 0
00:00:34.760 node1 2048kB 0 / 0
00:00:34.760
00:00:34.760 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:34.760 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:34.760 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:34.760 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:00:34.760 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:34.760 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
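setup.sh status prints the hugepage pools per NUMA node and then one row per DMA-capable device: type, PCI address (BDF), vendor/device IDs, NUMA node, bound driver, and block devices; on this node the only NVMe disk is 0000:1a:00.0, still bound to the kernel nvme driver. Pulling the NVMe addresses out of that table is a one-line awk filter (a sketch, assuming the exact column layout shown above and a working directory at the workspace root):

    # print the BDF (column 2) of every row whose type (column 1) is NVMe
    sudo ./spdk/scripts/setup.sh status | awk '$1 == "NVMe" {print $2}'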
00:00:34.760 + rm -f /tmp/spdk-ld-path
00:00:34.760 + source autorun-spdk.conf
00:00:34.760 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:34.760 ++ SPDK_TEST_FUZZER_SHORT=1
00:00:34.760 ++ SPDK_TEST_FUZZER=1
00:00:34.760 ++ SPDK_RUN_UBSAN=1
00:00:34.760 ++ RUN_NIGHTLY=0
00:00:34.760 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:34.760 + [[ -n '' ]]
00:00:34.761 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:34.761 + for M in /var/spdk/build-*-manifest.txt
00:00:34.761 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:34.761 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:00:34.761 + for M in /var/spdk/build-*-manifest.txt
00:00:34.761 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:34.761 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:00:34.761 ++ uname
00:00:34.761 + [[ Linux == \L\i\n\u\x ]]
00:00:34.761 + sudo dmesg -T
00:00:34.761 + sudo dmesg --clear
00:00:34.761 + dmesg_pid=1495660
00:00:34.761 + sudo dmesg -Tw
00:00:34.761 + [[ Fedora Linux == FreeBSD ]]
00:00:34.761 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:34.761 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:34.761 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:34.761 + [[ -x /usr/src/fio-static/fio ]]
00:00:34.761 + export FIO_BIN=/usr/src/fio-static/fio
00:00:34.761 + FIO_BIN=/usr/src/fio-static/fio
00:00:34.761 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:34.761 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:34.761 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:34.761 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:34.761 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:34.761 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:34.761 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:34.761 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:34.761 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:00:35.020 Test configuration:
00:00:35.020 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:35.020 SPDK_TEST_FUZZER_SHORT=1
00:00:35.020 SPDK_TEST_FUZZER=1
00:00:35.020 SPDK_RUN_UBSAN=1
00:00:35.020 RUN_NIGHTLY=0
19:07:21 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:00:35.020 19:07:21 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:35.020 19:07:21 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:35.020 19:07:21 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:35.020 19:07:21 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:35.020 19:07:21 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:35.020 19:07:21 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:35.020 19:07:21 -- paths/export.sh@5 -- $ export PATH
00:00:35.020 19:07:21 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
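export.sh unconditionally prepends the protoc, Go, and golangci directories each time it is sourced, which is why the PATH above carries each of them twice; lookup stops at the first match, so the duplicates are harmless, just noisy. An idempotent variant would guard each prepend (a sketch; path_prepend is a hypothetical helper, the versions are the ones on this node):

    # prepend a directory to PATH only if it is not already present
    path_prepend() {
        case ":$PATH:" in
            *":$1:"*) ;;            # already on PATH, nothing to do
            *) PATH="$1:$PATH" ;;
        esac
    }
    path_prepend /opt/golangci/1.54.2/bin
    path_prepend /opt/go/1.21.1/bin
    path_prepend /opt/protoc/21.7/bin
    export PATH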
00:00:35.020 19:07:21 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:00:35.020 19:07:21 -- common/autobuild_common.sh@435 -- $ date +%s
00:00:35.020 19:07:21 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713978441.XXXXXX
00:00:35.020 19:07:21 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713978441.Wkq6uh
00:00:35.020 19:07:21 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:00:35.020 19:07:21 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:00:35.020 19:07:21 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:00:35.020 19:07:21 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:35.020 19:07:21 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:35.020 19:07:21 -- common/autobuild_common.sh@451 -- $ get_config_params
00:00:35.020 19:07:21 -- common/autotest_common.sh@385 -- $ xtrace_disable
00:00:35.020 19:07:21 -- common/autotest_common.sh@10 -- $ set +x
00:00:35.020 19:07:21 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:00:35.020 19:07:21 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
00:00:35.020 19:07:21 -- pm/common@17 -- $ local monitor
00:00:35.020 19:07:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:35.020 19:07:21 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1495696
00:00:35.020 19:07:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:35.020 19:07:21 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1495697
00:00:35.020 19:07:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:35.020 19:07:21 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1495699
00:00:35.020 19:07:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:35.020 19:07:21 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1495701
00:00:35.020 19:07:21 -- pm/common@26 -- $ sleep 1
00:00:35.020 19:07:21 -- pm/common@21 -- $ date +%s
00:00:35.020 19:07:21 -- pm/common@21 -- $ date +%s
00:00:35.020 19:07:21 -- pm/common@21 -- $ date +%s
00:00:35.020 19:07:21 -- pm/common@21 -- $ date +%s
00:00:35.020 19:07:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713978441
00:00:35.020 19:07:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713978441
00:00:35.020 19:07:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713978441
00:00:35.020 19:07:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1713978441
00:00:35.020 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713978441_collect-cpu-load.pm.log
00:00:35.020 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713978441_collect-bmc-pm.bmc.pm.log
00:00:35.020 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713978441_collect-vmstat.pm.log
00:00:35.020 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1713978441_collect-cpu-temp.pm.log
00:00:35.958 19:07:22 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
00:00:35.958 19:07:22 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:35.958 19:07:22 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:35.958 19:07:22 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:35.958 19:07:22 -- spdk/autobuild.sh@16 -- $ date -u
00:00:35.958 Wed Apr 24 05:07:22 PM UTC 2024
00:00:35.958 19:07:22 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:35.958 v24.05-pre-442-g5c8d451f1
00:00:35.958 19:07:22 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:35.958 19:07:22 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:35.958 19:07:22 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:35.958 19:07:22 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:00:35.958 19:07:22 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:00:35.958 19:07:22 -- common/autotest_common.sh@10 -- $ set +x
00:00:36.217 ************************************
00:00:36.217 START TEST ubsan
00:00:36.217 ************************************
00:00:36.217 19:07:23 -- common/autotest_common.sh@1111 -- $ echo 'using ubsan'
00:00:36.217 using ubsan
00:00:36.217
00:00:36.217 real 0m0.000s
00:00:36.217 user 0m0.000s
00:00:36.217 sys 0m0.000s
00:00:36.217 19:07:23 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:00:36.217 19:07:23 -- common/autotest_common.sh@10 -- $ set +x
00:00:36.217 ************************************
00:00:36.217 END TEST ubsan
00:00:36.217 ************************************
00:00:36.217 19:07:23 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:36.217 19:07:23 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:36.217 19:07:23 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:36.217 19:07:23 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
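run_test is SPDK's timing harness: it prints a START TEST banner, runs the command under `time` (the real/user/sys triplets in this log), and closes with an END TEST banner, so each phase is reported and timed independently; the trivial ubsan "test" above only echoes a marker to record that the flag is active. A simplified sketch of that bracketing (not the SPDK original, which also validates arguments and tracks results):

    # run_test-style wrapper: banner, timed command, closing banner
    run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }
    run_test ubsan echo 'using ubsan'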
00:00:36.217 19:07:23 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:00:36.217 19:07:23 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:00:36.217 19:07:23 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:00:36.217 19:07:23 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:00:36.217 19:07:23 -- common/autotest_common.sh@10 -- $ set +x
00:00:36.479 ************************************
00:00:36.479 START TEST autobuild_llvm_precompile
00:00:36.479 ************************************
00:00:36.479 19:07:23 -- common/autotest_common.sh@1111 -- $ _llvm_precompile
00:00:36.479 19:07:23 -- common/autobuild_common.sh@32 -- $ clang --version
00:00:36.479 19:07:23 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38)
00:00:36.479 Target: x86_64-redhat-linux-gnu
00:00:36.479 Thread model: posix
00:00:36.479 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:00:36.479 19:07:23 -- common/autobuild_common.sh@33 -- $ clang_num=16
00:00:36.479 19:07:23 -- common/autobuild_common.sh@35 -- $ export CC=clang-16
00:00:36.479 19:07:23 -- common/autobuild_common.sh@35 -- $ CC=clang-16
00:00:36.479 19:07:23 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16
00:00:36.479 19:07:23 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16
00:00:36.479 19:07:23 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:00:36.479 19:07:23 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:00:36.479 19:07:23 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]]
00:00:36.479 19:07:23 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a'
00:00:36.479 19:07:23 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:00:36.737 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:00:36.737 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:00:37.002 Using 'verbs' RDMA provider
00:00:53.264 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:05.483 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:05.483 Creating mk/config.mk...done.
00:01:05.483 Creating mk/cc.flags.mk...done.
00:01:05.483 Type 'make' to build.
00:01:05.483
00:01:05.483 real 0m29.142s
00:01:05.483 user 0m12.709s
00:01:05.483 sys 0m15.778s
00:01:05.483 19:07:52 -- common/autotest_common.sh@1112 -- $ xtrace_disable
00:01:05.483 19:07:52 -- common/autotest_common.sh@10 -- $ set +x
00:01:05.483 ************************************
00:01:05.483 END TEST autobuild_llvm_precompile
00:01:05.483 ************************************
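_llvm_precompile parses the clang major version out of `clang --version` with the regex in the trace above, then expands an extglob pattern that accepts either a major-version or full-version directory and an optional -x86_64 suffix to locate libclang_rt.fuzzer_no_main, which it hands to configure as --with-fuzzer. A condensed sketch of that discovery step (simplified to the major-version layout that actually matched on this Fedora 38 node):

    # find the libFuzzer runtime for the installed clang; ?( ) needs extglob
    shopt -s extglob nullglob
    clang_num=16    # parsed from 'clang --version' in the trace above
    fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    [[ -e ${fuzzer_libs[0]} ]] && ./configure "--with-fuzzer=${fuzzer_libs[0]}"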
00:01:05.483 19:07:52 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:05.483 19:07:52 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:05.483 19:07:52 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:05.483 19:07:52 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:01:05.483 19:07:52 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a
00:01:05.743 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:05.743 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:06.312 Using 'verbs' RDMA provider
00:01:19.458 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:31.670 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:31.670 Creating mk/config.mk...done.
00:01:31.670 Creating mk/cc.flags.mk...done.
00:01:31.670 Type 'make' to build.
00:01:31.670 19:08:17 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:31.670 19:08:17 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
00:01:31.670 19:08:17 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:01:31.670 19:08:17 -- common/autotest_common.sh@10 -- $ set +x
00:01:31.670 ************************************
00:01:31.670 START TEST make
00:01:31.670 ************************************
00:01:31.670 19:08:17 -- common/autotest_common.sh@1111 -- $ make -j72
00:01:31.670 make[1]: Nothing to be done for 'all'.
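The build itself is also wrapped in run_test, so the `make -j72` phase gets its own banner and timing; -j72 is sized for this node, and the `make[1]: Nothing to be done for 'all'.` line is a sub-make that found its targets already up to date. A portable variant derives the job count instead of hardcoding it (a sketch):

    # scale the parallel build to whatever machine the job landed on
    make -j"$(nproc)"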
00:01:32.606 The Meson build system
00:01:32.606 Version: 1.3.1
00:01:32.606 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:01:32.606 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:32.606 Build type: native build
00:01:32.606 Project name: libvfio-user
00:01:32.606 Project version: 0.0.1
00:01:32.606 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:01:32.606 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:01:32.606 Host machine cpu family: x86_64
00:01:32.606 Host machine cpu: x86_64
00:01:32.606 Run-time dependency threads found: YES
00:01:32.606 Library dl found: YES
00:01:32.606 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:32.606 Run-time dependency json-c found: YES 0.17
00:01:32.606 Run-time dependency cmocka found: YES 1.1.7
00:01:32.606 Program pytest-3 found: NO
00:01:32.606 Program flake8 found: NO
00:01:32.606 Program misspell-fixer found: NO
00:01:32.606 Program restructuredtext-lint found: NO
00:01:32.606 Program valgrind found: YES (/usr/bin/valgrind)
00:01:32.606 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:32.606 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:32.606 Compiler for C supports arguments -Wwrite-strings: YES
00:01:32.606 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:32.606 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:01:32.606 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:01:32.606 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:01:32.606 Build targets in project: 8
00:01:32.606 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:01:32.606 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:01:32.606
00:01:32.606 libvfio-user 0.0.1
00:01:32.606
00:01:32.606 User defined options
00:01:32.606 buildtype : debug
00:01:32.607 default_library: static
00:01:32.607 libdir : /usr/local/lib
00:01:32.607
00:01:32.607 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:32.864 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:01:33.122 [1/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:01:33.122 [2/36] Compiling C object samples/lspci.p/lspci.c.o
00:01:33.122 [3/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:01:33.122 [4/36] Compiling C object samples/null.p/null.c.o
00:01:33.122 [5/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:01:33.122 [6/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:01:33.122 [7/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:01:33.122 [8/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:01:33.122 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:01:33.122 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:01:33.122 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:01:33.122 [12/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:01:33.122 [13/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:01:33.122 [14/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:01:33.122 [15/36] Compiling C object test/unit_tests.p/mocks.c.o
00:01:33.122 [16/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:01:33.122 [17/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:01:33.122 [18/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:01:33.122 [19/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:01:33.122 [20/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:01:33.122 [21/36] Compiling C object samples/client.p/client.c.o
00:01:33.122 [22/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:01:33.122 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:01:33.122 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:01:33.122 [25/36] Compiling C object samples/server.p/server.c.o
00:01:33.122 [26/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:01:33.122 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:01:33.122 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:01:33.122 [29/36] Linking static target lib/libvfio-user.a
00:01:33.123 [30/36] Linking target samples/client
00:01:33.123 [31/36] Linking target samples/gpio-pci-idio-16
00:01:33.123 [32/36] Linking target samples/server
00:01:33.123 [33/36] Linking target test/unit_tests
00:01:33.123 [34/36] Linking target samples/shadow_ioeventfd_server
00:01:33.123 [35/36] Linking target samples/null
00:01:33.123 [36/36] Linking target samples/lspci
00:01:33.123 INFO: autodetecting backend as ninja
00:01:33.123 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:01:33.381 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
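The install is staged: DESTDIR points meson install at a directory inside the SPDK build tree instead of the live /usr/local, and --quiet -C runs it against the configured build directory. The general shape, with a hypothetical staging root:

    # stage the artifacts under ./staging instead of the live filesystem
    DESTDIR="$PWD/staging" meson install --quiet -C build-debug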
00:01:33.639 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:01:33.639 ninja: no work to do.
00:01:40.205 The Meson build system
00:01:40.205 Version: 1.3.1
00:01:40.205 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
00:01:40.205 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp
00:01:40.205 Build type: native build
00:01:40.205 Program cat found: YES (/usr/bin/cat)
00:01:40.205 Project name: DPDK
00:01:40.205 Project version: 23.11.0
00:01:40.205 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)")
00:01:40.205 C linker for the host machine: clang-16 ld.bfd 2.39-16
00:01:40.205 Host machine cpu family: x86_64
00:01:40.205 Host machine cpu: x86_64
00:01:40.205 Message: ## Building in Developer Mode ##
00:01:40.205 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:40.205 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:40.205 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:40.205 Program python3 found: YES (/usr/bin/python3)
00:01:40.205 Program cat found: YES (/usr/bin/cat)
00:01:40.205 Compiler for C supports arguments -march=native: YES
00:01:40.205 Checking for size of "void *" : 8
00:01:40.205 Checking for size of "void *" : 8 (cached)
00:01:40.205 Library m found: YES
00:01:40.205 Library numa found: YES
00:01:40.205 Has header "numaif.h" : YES
00:01:40.205 Library fdt found: NO
00:01:40.205 Library execinfo found: NO
00:01:40.205 Has header "execinfo.h" : YES
00:01:40.205 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:40.205 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:40.205 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:40.205 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:40.205 Run-time dependency openssl found: YES 3.0.9
00:01:40.205 Run-time dependency libpcap found: YES 1.10.4
00:01:40.205 Has header "pcap.h" with dependency libpcap: YES
00:01:40.205 Compiler for C supports arguments -Wcast-qual: YES
00:01:40.205 Compiler for C supports arguments -Wdeprecated: YES
00:01:40.205 Compiler for C supports arguments -Wformat: YES
00:01:40.205 Compiler for C supports arguments -Wformat-nonliteral: YES
00:01:40.205 Compiler for C supports arguments -Wformat-security: YES
00:01:40.205 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:40.205 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:40.205 Compiler for C supports arguments -Wnested-externs: YES
00:01:40.205 Compiler for C supports arguments -Wold-style-definition: YES
00:01:40.205 Compiler for C supports arguments -Wpointer-arith: YES
00:01:40.205 Compiler for C supports arguments -Wsign-compare: YES
00:01:40.205 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:40.205 Compiler for C supports arguments -Wundef: YES
00:01:40.205 Compiler for C supports arguments -Wwrite-strings: YES
00:01:40.205 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:40.205 Compiler for C supports arguments -Wno-packed-not-aligned: NO
00:01:40.205 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:40.205 Program objdump found: YES (/usr/bin/objdump)
00:01:40.205 Compiler for C supports arguments -mavx512f: YES
00:01:40.205 Checking if "AVX512 checking" compiles: YES
00:01:40.205 Fetching value of define "__SSE4_2__" : 1
00:01:40.205 Fetching value of define "__AES__" : 1
00:01:40.205 Fetching value of define "__AVX__" : 1
00:01:40.205 Fetching value of define "__AVX2__" : 1
00:01:40.205 Fetching value of define "__AVX512BW__" : 1
00:01:40.205 Fetching value of define "__AVX512CD__" : 1
00:01:40.205 Fetching value of define "__AVX512DQ__" : 1
00:01:40.205 Fetching value of define "__AVX512F__" : 1
00:01:40.205 Fetching value of define "__AVX512VL__" : 1
00:01:40.205 Fetching value of define "__PCLMUL__" : 1
00:01:40.205 Fetching value of define "__RDRND__" : 1
00:01:40.205 Fetching value of define "__RDSEED__" : 1
00:01:40.205 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:40.205 Fetching value of define "__znver1__" : (undefined)
00:01:40.205 Fetching value of define "__znver2__" : (undefined)
00:01:40.205 Fetching value of define "__znver3__" : (undefined)
00:01:40.205 Fetching value of define "__znver4__" : (undefined)
00:01:40.205 Compiler for C supports arguments -Wno-format-truncation: NO
00:01:40.205 Message: lib/log: Defining dependency "log"
00:01:40.205 Message: lib/kvargs: Defining dependency "kvargs"
00:01:40.205 Message: lib/telemetry: Defining dependency "telemetry"
00:01:40.205 Checking for function "getentropy" : NO
00:01:40.205 Message: lib/eal: Defining dependency "eal"
00:01:40.205 Message: lib/ring: Defining dependency "ring"
00:01:40.205 Message: lib/rcu: Defining dependency "rcu"
00:01:40.205 Message: lib/mempool: Defining dependency "mempool"
00:01:40.205 Message: lib/mbuf: Defining dependency "mbuf"
00:01:40.205 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:40.205 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:40.205 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:40.205 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:40.205 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:40.205 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:40.205 Compiler for C supports arguments -mpclmul: YES
00:01:40.205 Compiler for C supports arguments -maes: YES
00:01:40.205 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:40.205 Compiler for C supports arguments -mavx512bw: YES
00:01:40.205 Compiler for C supports arguments -mavx512dq: YES
00:01:40.205 Compiler for C supports arguments -mavx512vl: YES
00:01:40.205 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:40.205 Compiler for C supports arguments -mavx2: YES
00:01:40.205 Compiler for C supports arguments -mavx: YES
00:01:40.205 Message: lib/net: Defining dependency "net"
00:01:40.205 Message: lib/meter: Defining dependency "meter"
00:01:40.205 Message: lib/ethdev: Defining dependency "ethdev"
00:01:40.205 Message: lib/pci: Defining dependency "pci"
00:01:40.205 Message: lib/cmdline: Defining dependency "cmdline"
00:01:40.205 Message: lib/hash: Defining dependency "hash"
00:01:40.205 Message: lib/timer: Defining dependency "timer"
00:01:40.205 Message: lib/compressdev: Defining dependency "compressdev"
00:01:40.205 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:40.205 Message: lib/dmadev: Defining dependency "dmadev"
00:01:40.205 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:40.205 Message: lib/power: Defining dependency "power"
00:01:40.205 Message: lib/reorder: Defining dependency "reorder"
00:01:40.205 Message: lib/security: Defining dependency "security"
00:01:40.205 Has header "linux/userfaultfd.h" : YES
00:01:40.205 Has header "linux/vduse.h" : YES
00:01:40.205 Message: lib/vhost: Defining dependency "vhost"
00:01:40.205 Compiler for C supports arguments -Wno-format-truncation: NO (cached)
00:01:40.205 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:40.205 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:40.205 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:01:40.205 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:01:40.205 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:01:40.205 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:01:40.205 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:01:40.205 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:01:40.205 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:01:40.205 Program doxygen found: YES (/usr/bin/doxygen)
00:01:40.205 Configuring doxy-api-html.conf using configuration
00:01:40.205 Configuring doxy-api-man.conf using configuration
00:01:40.205 Program mandb found: YES (/usr/bin/mandb)
00:01:40.205 Program sphinx-build found: NO
00:01:40.205 Configuring rte_build_config.h using configuration
00:01:40.205 Message:
00:01:40.205 =================
00:01:40.205 Applications Enabled
00:01:40.205 =================
00:01:40.205
00:01:40.205 apps:
00:01:40.205
00:01:40.205
00:01:40.205 Message:
00:01:40.205 =================
00:01:40.205 Libraries Enabled
00:01:40.205 =================
00:01:40.205
00:01:40.205 libs:
00:01:40.205 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:40.205 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:01:40.205 cryptodev, dmadev, power, reorder, security, vhost,
00:01:40.205
00:01:40.205 Message:
00:01:40.205 ===============
00:01:40.205 Drivers Enabled
00:01:40.205 ===============
00:01:40.205
00:01:40.205 common:
00:01:40.205
00:01:40.205 bus:
00:01:40.205 pci, vdev,
00:01:40.205 mempool:
00:01:40.205 ring,
00:01:40.205 dma:
00:01:40.205
00:01:40.205 net:
00:01:40.205
00:01:40.205 crypto:
00:01:40.205
00:01:40.205 compress:
00:01:40.205
00:01:40.205 vdpa:
00:01:40.205
00:01:40.205
00:01:40.205 Message:
00:01:40.205 =================
00:01:40.205 Content Skipped
00:01:40.205 =================
00:01:40.205
00:01:40.205 apps:
00:01:40.205 dumpcap: explicitly disabled via build config
00:01:40.205 graph: explicitly disabled via build config
00:01:40.205 pdump: explicitly disabled via build config
00:01:40.205 proc-info: explicitly disabled via build config
00:01:40.205 test-acl: explicitly disabled via build config
00:01:40.205 test-bbdev: explicitly disabled via build config
00:01:40.205 test-cmdline: explicitly disabled via build config
00:01:40.205 test-compress-perf: explicitly disabled via build config
00:01:40.205 test-crypto-perf: explicitly disabled via build config
00:01:40.205 test-dma-perf: explicitly disabled via build config
00:01:40.205 test-eventdev: explicitly disabled via build config
00:01:40.205 test-fib: explicitly disabled via build config
00:01:40.205 test-flow-perf: explicitly disabled via build config
00:01:40.205 test-gpudev: explicitly disabled via build config
00:01:40.205 test-mldev: explicitly disabled via build config
00:01:40.206 test-pipeline: explicitly disabled via build config
00:01:40.206 test-pmd: explicitly disabled via build config
00:01:40.206 test-regex: explicitly disabled via build config
00:01:40.206 test-sad: explicitly disabled via build config
00:01:40.206 test-security-perf: explicitly disabled via build config
00:01:40.206
00:01:40.206 libs:
00:01:40.206 metrics: explicitly disabled via build config
00:01:40.206 acl: explicitly disabled via build config
00:01:40.206 bbdev: explicitly disabled via build config
00:01:40.206 bitratestats: explicitly disabled via build config
00:01:40.206 bpf: explicitly disabled via build config
00:01:40.206 cfgfile: explicitly disabled via build config
00:01:40.206 distributor: explicitly disabled via build config
00:01:40.206 efd: explicitly disabled via build config
00:01:40.206 eventdev: explicitly disabled via build config
00:01:40.206 dispatcher: explicitly disabled via build config
00:01:40.206 gpudev: explicitly disabled via build config
00:01:40.206 gro: explicitly disabled via build config
00:01:40.206 gso: explicitly disabled via build config
00:01:40.206 ip_frag: explicitly disabled via build config
00:01:40.206 jobstats: explicitly disabled via build config
00:01:40.206 latencystats: explicitly disabled via build config
00:01:40.206 lpm: explicitly disabled via build config
00:01:40.206 member: explicitly disabled via build config
00:01:40.206 pcapng: explicitly disabled via build config
00:01:40.206 rawdev: explicitly disabled via build config
00:01:40.206 regexdev: explicitly disabled via build config
00:01:40.206 mldev: explicitly disabled via build config
00:01:40.206 rib: explicitly disabled via build config
00:01:40.206 sched: explicitly disabled via build config
00:01:40.206 stack: explicitly disabled via build config
00:01:40.206 ipsec: explicitly disabled via build config
00:01:40.206 pdcp: explicitly disabled via build config
00:01:40.206 fib: explicitly disabled via build config
00:01:40.206 port: explicitly disabled via build config
00:01:40.206 pdump: explicitly disabled via build config
00:01:40.206 table: explicitly disabled via build config
00:01:40.206 pipeline: explicitly disabled via build config
00:01:40.206 graph: explicitly disabled via build config
00:01:40.206 node: explicitly disabled via build config
00:01:40.206
00:01:40.206 drivers:
00:01:40.206 common/cpt: not in enabled drivers build config
00:01:40.206 common/dpaax: not in enabled drivers build config
00:01:40.206 common/iavf: not in enabled drivers build config
00:01:40.206 common/idpf: not in enabled drivers build config
00:01:40.206 common/mvep: not in enabled drivers build config
00:01:40.206 common/octeontx: not in enabled drivers build config
00:01:40.206 bus/auxiliary: not in enabled drivers build config
00:01:40.206 bus/cdx: not in enabled drivers build config
00:01:40.206 bus/dpaa: not in enabled drivers build config
00:01:40.206 bus/fslmc: not in enabled drivers build config
00:01:40.206 bus/ifpga: not in enabled drivers build config
00:01:40.206 bus/platform: not in enabled drivers build config
00:01:40.206 bus/vmbus: not in enabled drivers build config
00:01:40.206 common/cnxk: not in enabled drivers build config
00:01:40.206 common/mlx5: not in enabled drivers build config
00:01:40.206 common/nfp: not in enabled drivers build config
00:01:40.206 common/qat: not in enabled drivers build config
00:01:40.206 common/sfc_efx: not in enabled drivers build config
00:01:40.206 mempool/bucket: not in enabled drivers build config
00:01:40.206 mempool/cnxk: not in enabled drivers build config
00:01:40.206 mempool/dpaa: not in enabled drivers build config
00:01:40.206 mempool/dpaa2: not in enabled drivers build config
00:01:40.206 mempool/octeontx: not in enabled drivers build config
00:01:40.206 mempool/stack: not in enabled drivers build config
00:01:40.206 dma/cnxk: not in enabled drivers build config
00:01:40.206 dma/dpaa: not in enabled drivers build config
00:01:40.206 dma/dpaa2: not in enabled drivers build config
00:01:40.206 dma/hisilicon: not in enabled drivers build config
00:01:40.206 dma/idxd: not in enabled drivers build config
00:01:40.206 dma/ioat: not in enabled drivers build config
00:01:40.206 dma/skeleton: not in enabled drivers build config
00:01:40.206 net/af_packet: not in enabled drivers build config
00:01:40.206 net/af_xdp: not in enabled drivers build config
00:01:40.206 net/ark: not in enabled drivers build config
00:01:40.206 net/atlantic: not in enabled drivers build config
00:01:40.206 net/avp: not in enabled drivers build config
00:01:40.206 net/axgbe: not in enabled drivers build config
00:01:40.206 net/bnx2x: not in enabled drivers build config
00:01:40.206 net/bnxt: not in enabled drivers build config
00:01:40.206 net/bonding: not in enabled drivers build config
00:01:40.206 net/cnxk: not in enabled drivers build config
00:01:40.206 net/cpfl: not in enabled drivers build config
00:01:40.206 net/cxgbe: not in enabled drivers build config
00:01:40.206 net/dpaa: not in enabled drivers build config
00:01:40.206 net/dpaa2: not in enabled drivers build config
00:01:40.206 net/e1000: not in enabled drivers build config
00:01:40.206 net/ena: not in enabled drivers build config
00:01:40.206 net/enetc: not in enabled drivers build config
00:01:40.206 net/enetfec: not in enabled drivers build config
00:01:40.206 net/enic: not in enabled drivers build config
00:01:40.206 net/failsafe: not in enabled drivers build config
00:01:40.206 net/fm10k: not in enabled drivers build config
00:01:40.206 net/gve: not in enabled drivers build config
00:01:40.206 net/hinic: not in enabled drivers build config
00:01:40.206 net/hns3: not in enabled drivers build config
00:01:40.206 net/i40e: not in enabled drivers build config
00:01:40.206 net/iavf: not in enabled drivers build config
00:01:40.206 net/ice: not in enabled drivers build config
00:01:40.206 net/idpf: not in enabled drivers build config
00:01:40.206 net/igc: not in enabled drivers build config
00:01:40.206 net/ionic: not in enabled drivers build config
00:01:40.206 net/ipn3ke: not in enabled drivers build config
00:01:40.206 net/ixgbe: not in enabled drivers build config
00:01:40.206 net/mana: not in enabled drivers build config
00:01:40.206 net/memif: not in enabled drivers build config
00:01:40.206 net/mlx4: not in enabled drivers build config
00:01:40.206 net/mlx5: not in enabled drivers build config
00:01:40.206 net/mvneta: not in enabled drivers build config
00:01:40.206 net/mvpp2: not in enabled drivers build config
00:01:40.206 net/netvsc: not in enabled drivers build config
00:01:40.206 net/nfb: not in enabled drivers build config
00:01:40.206 net/nfp: not in enabled drivers build config
00:01:40.206 net/ngbe: not in enabled drivers build config
00:01:40.206 net/null: not in enabled drivers build config
00:01:40.206 net/octeontx: not in enabled drivers build config
00:01:40.206 net/octeon_ep: not in enabled drivers build config
00:01:40.206 net/pcap: not in enabled drivers build config
00:01:40.206 net/pfe: not in enabled drivers build config
00:01:40.206 net/qede: not in enabled drivers build config
00:01:40.206 net/ring: not in enabled drivers build config
00:01:40.206 net/sfc: not in enabled drivers build config
00:01:40.206 net/softnic: not in enabled drivers build config
00:01:40.206 net/tap: not in enabled drivers build config
00:01:40.206 net/thunderx: not in enabled drivers build config
00:01:40.206 net/txgbe: not in enabled drivers build config
00:01:40.206 net/vdev_netvsc: not in enabled drivers build config
00:01:40.206 net/vhost: not in enabled drivers build config
00:01:40.206 net/virtio: not in enabled drivers build config
00:01:40.206 net/vmxnet3: not in enabled drivers build config
00:01:40.206 raw/*: missing internal dependency, "rawdev"
00:01:40.206 crypto/armv8: not in enabled drivers build config
00:01:40.206 crypto/bcmfs: not in enabled drivers build config
00:01:40.206 crypto/caam_jr: not in enabled drivers build config
00:01:40.206 crypto/ccp: not in enabled drivers build config
00:01:40.206 crypto/cnxk: not in enabled drivers build config
00:01:40.206 crypto/dpaa_sec: not in enabled drivers build config
00:01:40.206 crypto/dpaa2_sec: not in enabled drivers build config
00:01:40.206 crypto/ipsec_mb: not in enabled drivers build config
00:01:40.206 crypto/mlx5: not in enabled drivers build config
00:01:40.206 crypto/mvsam: not in enabled drivers build config
00:01:40.206 crypto/nitrox: not in enabled drivers build config
00:01:40.206 crypto/null: not in enabled drivers build config
00:01:40.206 crypto/octeontx: not in enabled drivers build config
00:01:40.206 crypto/openssl: not in enabled drivers build config
00:01:40.206 crypto/scheduler: not in enabled drivers build config
00:01:40.206 crypto/uadk: not in enabled drivers build config
00:01:40.206 crypto/virtio: not in enabled drivers build config
00:01:40.206 compress/isal: not in enabled drivers build config
00:01:40.206 compress/mlx5: not in enabled drivers build config
00:01:40.206 compress/octeontx: not in enabled drivers build config
00:01:40.206 compress/zlib: not in enabled drivers build config
00:01:40.206 regex/*: missing internal dependency, "regexdev"
00:01:40.206 ml/*: missing internal dependency, "mldev"
00:01:40.206 vdpa/ifc: not in enabled drivers build config
00:01:40.206 vdpa/mlx5: not in enabled drivers build config
00:01:40.206 vdpa/nfp: not in enabled drivers build config
00:01:40.206 vdpa/sfc: not in enabled drivers build config
00:01:40.206 event/*: missing internal dependency, "eventdev"
00:01:40.206 baseband/*: missing internal dependency, "bbdev"
00:01:40.206 gpu/*: missing internal dependency, "gpudev"
00:01:40.206
00:01:40.206
00:01:40.206 Build targets in project: 85
00:01:40.206
00:01:40.206 DPDK 23.11.0
00:01:40.206
00:01:40.206 User defined options
00:01:40.207 buildtype : debug
00:01:40.207 default_library : static
00:01:40.207 libdir : lib
00:01:40.207 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:01:40.207 c_args : -fPIC -Werror
00:01:40.207 c_link_args :
00:01:40.207 cpu_instruction_set: native
00:01:40.207 disable_apps : test-acl,test-bbdev,test-crypto-perf,test-fib,test-pipeline,test-gpudev,test-flow-perf,pdump,dumpcap,test-sad,test-cmdline,test-eventdev,proc-info,test,test-dma-perf,test-pmd,test-mldev,test-compress-perf,test-security-perf,graph,test-regex
00:01:40.207 disable_libs : pipeline,member,eventdev,efd,bbdev,cfgfile,rib,sched,mldev,metrics,lpm,latencystats,pdump,pdcp,bpf,ipsec,fib,ip_frag,table,port,stack,gro,jobstats,regexdev,rawdev,pcapng,dispatcher,node,bitratestats,acl,gpudev,distributor,graph,gso
00:01:40.207 enable_docs : false
00:01:40.207 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring
00:01:40.207 enable_kmods : false
00:01:40.207 tests : false
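The "User defined options" block records what SPDK's configure handed down to DPDK's meson: a debug, static, -fPIC -Werror build targeting the native ISA, with disable_apps/disable_libs trimming the project to the 85 targets SPDK actually consumes. A rough standalone equivalent would look like this (a sketch; the option lists are abbreviated to a few entries from the log above):

    # hypothetical standalone reproduction of the DPDK meson setup above
    meson setup build-tmp \
        --buildtype=debug \
        --default-library=static \
        -Dc_args='-fPIC -Werror' \
        -Dcpu_instruction_set=native \
        -Ddisable_apps='test-acl,test-bbdev,graph' \
        -Ddisable_libs='pipeline,member,eventdev' \
        -Denable_drivers='bus,bus/pci,bus/vdev,mempool/ring' \
        -Denable_docs=false \
        -Dtests=false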
00:01:40.207 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:40.207 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp'
00:01:40.207 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:01:40.207 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:01:40.207 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:01:40.207 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:01:40.207 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:01:40.207 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:01:40.207 [7/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:01:40.207 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:01:40.207 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:01:40.207 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:01:40.207 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o
00:01:40.207 [12/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o
00:01:40.207 [13/265] Compiling C object lib/librte_log.a.p/log_log.c.o
00:01:40.207 [14/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o
00:01:40.207 [15/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:01:40.207 [16/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o
00:01:40.207 [17/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:01:40.207 [18/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o
00:01:40.207 [19/265] Linking static target lib/librte_kvargs.a
00:01:40.207 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o
00:01:40.207 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o
00:01:40.207 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o
00:01:40.207 [23/265] Linking static target lib/librte_log.a
00:01:40.207 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o
00:01:40.207 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o
00:01:40.464 [26/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:01:40.464 [27/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o
00:01:40.464 [28/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:01:40.464 [29/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:01:40.464 [30/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o
00:01:40.464 [31/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:01:40.464 [32/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:01:40.464 [33/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o
00:01:40.464 [34/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o
00:01:40.464 [35/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o
00:01:40.464 [36/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:01:40.464 [37/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:01:40.464 [38/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o
00:01:40.465 [39/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:01:40.465 [40/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o
00:01:40.465 [41/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:01:40.465 [42/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o
00:01:40.465 [43/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o
00:01:40.465 [44/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o
00:01:40.465 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o
00:01:40.465 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o
00:01:40.465 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o
00:01:40.465 [48/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o
00:01:40.465 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:01:40.465 [50/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o
00:01:40.465 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o
00:01:40.465 [52/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:01:40.465 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:01:40.465 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o
00:01:40.465 [55/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o
00:01:40.465 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o
00:01:40.465 [57/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o
00:01:40.465 [58/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o
00:01:40.465 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o
00:01:40.465 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o
00:01:40.465 [61/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:01:40.465 [62/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o
00:01:40.465 [63/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:01:40.465 [64/265] Linking static target lib/librte_telemetry.a
00:01:40.465 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:01:40.465 [66/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o
00:01:40.465 [67/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o
00:01:40.465 [68/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o
00:01:40.465 [69/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o
00:01:40.465 [70/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o
00:01:40.465 [71/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o
00:01:40.465 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o
00:01:40.465 [73/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o
00:01:40.465 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:01:40.465 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o
00:01:40.465 [76/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o
00:01:40.465 [77/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o
00:01:40.465 [78/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o
[79/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:40.465 [80/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:40.465 [81/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:40.465 [82/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:40.465 [83/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:40.465 [84/265] Linking static target lib/librte_ring.a 00:01:40.465 [85/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:40.465 [86/265] Linking static target lib/librte_pci.a 00:01:40.465 [87/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:40.465 [88/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:40.726 [89/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:40.726 [90/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:40.726 [91/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:40.726 [92/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:40.726 [93/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:40.726 [94/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:40.726 [95/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:40.726 [96/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:40.726 [97/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:40.726 [98/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:40.726 [99/265] Linking static target lib/librte_meter.a 00:01:40.726 [100/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:40.726 [101/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:40.726 [102/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:40.726 [103/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:40.726 [104/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:40.726 [105/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:40.726 [106/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:40.726 [107/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:40.726 [108/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.726 [109/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:40.726 [110/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:40.726 [111/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:40.726 [112/265] Linking static target lib/librte_eal.a 00:01:40.726 [113/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:40.726 [114/265] Linking static target lib/librte_rcu.a 00:01:40.726 [115/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:40.726 [116/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:40.726 [117/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:40.726 [118/265] Linking static target lib/librte_mempool.a 00:01:40.726 [119/265] Linking static target lib/librte_net.a 00:01:40.726 [120/265] Linking target lib/librte_log.so.24.0 00:01:40.726 [121/265] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:40.726 [122/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:40.726 [123/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:40.726 [124/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.726 [125/265] Linking static target lib/librte_mbuf.a 00:01:40.985 [126/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.985 [127/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.985 [128/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:40.985 [129/265] Linking target lib/librte_kvargs.so.24.0 00:01:40.985 [130/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.985 [131/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.985 [132/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.985 [133/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:40.985 [134/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:40.985 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:40.985 [136/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:40.985 [137/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:40.985 [138/265] Linking target lib/librte_telemetry.so.24.0 00:01:40.985 [139/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:40.985 [140/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:40.985 [141/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:40.985 [142/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:40.985 [143/265] Linking static target lib/librte_timer.a 00:01:40.985 [144/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:41.244 [145/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:41.244 [146/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:41.244 [147/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:41.244 [148/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:41.244 [149/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:41.244 [150/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:41.244 [151/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:41.244 [152/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:41.244 [153/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:41.244 [154/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:41.244 [155/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:41.244 [156/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:41.244 [157/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:41.244 [158/265] Linking static target lib/librte_cmdline.a 00:01:41.244 [159/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:41.244 [160/265] 
Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:41.244 [161/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:41.244 [162/265] Linking static target lib/librte_compressdev.a 00:01:41.244 [163/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:41.244 [164/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:41.244 [165/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:41.244 [166/265] Linking static target lib/librte_dmadev.a 00:01:41.244 [167/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:41.244 [168/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:41.244 [169/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:41.244 [170/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:41.244 [171/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:41.244 [172/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:41.244 [173/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:41.244 [174/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:41.244 [175/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:41.244 [176/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:41.244 [177/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:41.244 [178/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:41.244 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:41.244 [180/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:41.244 [181/265] Linking static target lib/librte_reorder.a 00:01:41.244 [182/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:41.244 [183/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:41.244 [184/265] Linking static target lib/librte_power.a 00:01:41.244 [185/265] Linking static target lib/librte_security.a 00:01:41.244 [186/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:41.244 [187/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:41.244 [188/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:41.244 [189/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:41.244 [190/265] Linking static target lib/librte_hash.a 00:01:41.244 [191/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:41.244 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:41.244 [193/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:41.244 [194/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:41.503 [195/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:41.503 [196/265] Linking static target lib/librte_cryptodev.a 00:01:41.503 [197/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:41.503 [198/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:41.503 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:41.503 [200/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:41.503 [201/265] Generating lib/mempool.sym_chk 
with a custom command (wrapped by meson to capture output) 00:01:41.503 [202/265] Linking static target drivers/librte_bus_vdev.a 00:01:41.503 [203/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:41.503 [204/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:41.503 [205/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.503 [206/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.503 [207/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:41.503 [208/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:41.503 [209/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:41.503 [210/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:41.503 [211/265] Linking static target drivers/librte_mempool_ring.a 00:01:41.503 [212/265] Linking static target drivers/librte_bus_pci.a 00:01:41.760 [213/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:41.760 [214/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.760 [215/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:41.760 [216/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.760 [217/265] Linking static target lib/librte_ethdev.a 00:01:41.760 [218/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.760 [219/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.017 [220/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.275 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.275 [222/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:42.275 [223/265] Linking static target lib/librte_vhost.a 00:01:42.275 [224/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.275 [225/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:42.532 [226/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:43.926 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.560 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.116 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.016 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.016 [231/265] Linking target lib/librte_eal.so.24.0 00:01:53.016 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:53.016 [233/265] Linking target lib/librte_meter.so.24.0 00:01:53.016 [234/265] Linking target lib/librte_ring.so.24.0 00:01:53.016 [235/265] Linking target drivers/librte_bus_vdev.so.24.0 00:01:53.016 [236/265] Linking target lib/librte_timer.so.24.0 00:01:53.016 [237/265] Linking target lib/librte_dmadev.so.24.0 00:01:53.016 [238/265] Linking target lib/librte_pci.so.24.0 00:01:53.275 
[239/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:53.275 [240/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:53.275 [241/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:53.275 [242/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:53.275 [243/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:53.275 [244/265] Linking target lib/librte_rcu.so.24.0 00:01:53.275 [245/265] Linking target lib/librte_mempool.so.24.0 00:01:53.275 [246/265] Linking target drivers/librte_bus_pci.so.24.0 00:01:53.275 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:53.533 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:53.533 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:01:53.533 [250/265] Linking target lib/librte_mbuf.so.24.0 00:01:53.533 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:53.792 [252/265] Linking target lib/librte_net.so.24.0 00:01:53.792 [253/265] Linking target lib/librte_cryptodev.so.24.0 00:01:53.792 [254/265] Linking target lib/librte_compressdev.so.24.0 00:01:53.792 [255/265] Linking target lib/librte_reorder.so.24.0 00:01:53.792 [256/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:53.792 [257/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:54.051 [258/265] Linking target lib/librte_security.so.24.0 00:01:54.051 [259/265] Linking target lib/librte_cmdline.so.24.0 00:01:54.051 [260/265] Linking target lib/librte_hash.so.24.0 00:01:54.051 [261/265] Linking target lib/librte_ethdev.so.24.0 00:01:54.051 [262/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:54.051 [263/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:54.051 [264/265] Linking target lib/librte_power.so.24.0 00:01:54.051 [265/265] Linking target lib/librte_vhost.so.24.0 00:01:54.051 INFO: autodetecting backend as ninja 00:01:54.051 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 72 00:01:54.987 CC lib/log/log.o 00:01:54.987 CC lib/log/log_flags.o 00:01:54.987 CC lib/log/log_deprecated.o 00:01:54.987 CC lib/ut_mock/mock.o 00:01:54.987 CC lib/ut/ut.o 00:01:55.245 LIB libspdk_log.a 00:01:55.245 LIB libspdk_ut_mock.a 00:01:55.245 LIB libspdk_ut.a 00:01:55.503 CXX lib/trace_parser/trace.o 00:01:55.503 CC lib/ioat/ioat.o 00:01:55.503 CC lib/dma/dma.o 00:01:55.503 CC lib/util/bit_array.o 00:01:55.503 CC lib/util/base64.o 00:01:55.503 CC lib/util/crc16.o 00:01:55.503 CC lib/util/cpuset.o 00:01:55.503 CC lib/util/crc32.o 00:01:55.503 CC lib/util/crc64.o 00:01:55.503 CC lib/util/crc32c.o 00:01:55.503 CC lib/util/crc32_ieee.o 00:01:55.503 CC lib/util/dif.o 00:01:55.503 CC lib/util/fd.o 00:01:55.503 CC lib/util/file.o 00:01:55.503 CC lib/util/hexlify.o 00:01:55.503 CC lib/util/iov.o 00:01:55.503 CC lib/util/math.o 00:01:55.503 CC lib/util/pipe.o 00:01:55.503 CC lib/util/strerror_tls.o 00:01:55.503 CC lib/util/string.o 00:01:55.503 CC lib/util/uuid.o 00:01:55.503 CC lib/util/fd_group.o 00:01:55.503 CC lib/util/xor.o 00:01:55.503 CC lib/util/zipf.o 00:01:55.503 CC lib/vfio_user/host/vfio_user_pci.o 00:01:55.503 CC 
lib/vfio_user/host/vfio_user.o 00:01:55.503 LIB libspdk_dma.a 00:01:55.762 LIB libspdk_ioat.a 00:01:55.762 LIB libspdk_vfio_user.a 00:01:55.762 LIB libspdk_util.a 00:01:56.021 LIB libspdk_trace_parser.a 00:01:56.021 CC lib/idxd/idxd.o 00:01:56.021 CC lib/idxd/idxd_user.o 00:01:56.021 CC lib/conf/conf.o 00:01:56.021 CC lib/json/json_parse.o 00:01:56.021 CC lib/json/json_write.o 00:01:56.021 CC lib/json/json_util.o 00:01:56.021 CC lib/vmd/vmd.o 00:01:56.021 CC lib/vmd/led.o 00:01:56.021 CC lib/rdma/common.o 00:01:56.021 CC lib/env_dpdk/env.o 00:01:56.021 CC lib/rdma/rdma_verbs.o 00:01:56.021 CC lib/env_dpdk/memory.o 00:01:56.021 CC lib/env_dpdk/pci.o 00:01:56.021 CC lib/env_dpdk/init.o 00:01:56.021 CC lib/env_dpdk/threads.o 00:01:56.021 CC lib/env_dpdk/pci_ioat.o 00:01:56.021 CC lib/env_dpdk/pci_virtio.o 00:01:56.280 CC lib/env_dpdk/pci_vmd.o 00:01:56.280 CC lib/env_dpdk/pci_idxd.o 00:01:56.280 CC lib/env_dpdk/pci_event.o 00:01:56.280 CC lib/env_dpdk/sigbus_handler.o 00:01:56.280 CC lib/env_dpdk/pci_dpdk.o 00:01:56.280 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:56.280 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:56.280 LIB libspdk_conf.a 00:01:56.280 LIB libspdk_json.a 00:01:56.280 LIB libspdk_rdma.a 00:01:56.540 LIB libspdk_idxd.a 00:01:56.540 LIB libspdk_vmd.a 00:01:56.540 CC lib/jsonrpc/jsonrpc_server.o 00:01:56.540 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:56.540 CC lib/jsonrpc/jsonrpc_client.o 00:01:56.540 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:56.800 LIB libspdk_jsonrpc.a 00:01:57.058 LIB libspdk_env_dpdk.a 00:01:57.058 CC lib/rpc/rpc.o 00:01:57.316 LIB libspdk_rpc.a 00:01:57.575 CC lib/notify/notify.o 00:01:57.575 CC lib/notify/notify_rpc.o 00:01:57.575 CC lib/keyring/keyring_rpc.o 00:01:57.575 CC lib/keyring/keyring.o 00:01:57.575 CC lib/trace/trace_flags.o 00:01:57.575 CC lib/trace/trace_rpc.o 00:01:57.575 CC lib/trace/trace.o 00:01:57.575 LIB libspdk_notify.a 00:01:57.575 LIB libspdk_keyring.a 00:01:57.833 LIB libspdk_trace.a 00:01:58.091 CC lib/sock/sock.o 00:01:58.091 CC lib/sock/sock_rpc.o 00:01:58.091 CC lib/thread/thread.o 00:01:58.091 CC lib/thread/iobuf.o 00:01:58.349 LIB libspdk_sock.a 00:01:58.607 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:58.607 CC lib/nvme/nvme_ctrlr.o 00:01:58.607 CC lib/nvme/nvme_ns_cmd.o 00:01:58.607 CC lib/nvme/nvme_fabric.o 00:01:58.607 CC lib/nvme/nvme_ns.o 00:01:58.607 CC lib/nvme/nvme_pcie_common.o 00:01:58.607 CC lib/nvme/nvme_qpair.o 00:01:58.607 CC lib/nvme/nvme_pcie.o 00:01:58.607 CC lib/nvme/nvme.o 00:01:58.607 CC lib/nvme/nvme_quirks.o 00:01:58.607 CC lib/nvme/nvme_transport.o 00:01:58.607 CC lib/nvme/nvme_discovery.o 00:01:58.607 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:58.607 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:58.607 CC lib/nvme/nvme_tcp.o 00:01:58.607 CC lib/nvme/nvme_opal.o 00:01:58.607 CC lib/nvme/nvme_io_msg.o 00:01:58.607 CC lib/nvme/nvme_poll_group.o 00:01:58.607 CC lib/nvme/nvme_zns.o 00:01:58.607 CC lib/nvme/nvme_stubs.o 00:01:58.607 CC lib/nvme/nvme_auth.o 00:01:58.607 CC lib/nvme/nvme_cuse.o 00:01:58.607 CC lib/nvme/nvme_vfio_user.o 00:01:58.607 CC lib/nvme/nvme_rdma.o 00:01:58.866 LIB libspdk_thread.a 00:01:59.125 CC lib/accel/accel_rpc.o 00:01:59.125 CC lib/accel/accel.o 00:01:59.125 CC lib/accel/accel_sw.o 00:01:59.125 CC lib/init/json_config.o 00:01:59.125 CC lib/init/subsystem.o 00:01:59.125 CC lib/init/subsystem_rpc.o 00:01:59.125 CC lib/init/rpc.o 00:01:59.125 CC lib/blob/blobstore.o 00:01:59.125 CC lib/blob/zeroes.o 00:01:59.125 CC lib/blob/request.o 00:01:59.125 CC lib/vfu_tgt/tgt_endpoint.o 00:01:59.125 CC lib/blob/blob_bs_dev.o 
00:01:59.125 CC lib/vfu_tgt/tgt_rpc.o 00:01:59.125 CC lib/virtio/virtio.o 00:01:59.125 CC lib/virtio/virtio_vhost_user.o 00:01:59.125 CC lib/virtio/virtio_vfio_user.o 00:01:59.125 CC lib/virtio/virtio_pci.o 00:01:59.383 LIB libspdk_init.a 00:01:59.383 LIB libspdk_virtio.a 00:01:59.383 LIB libspdk_vfu_tgt.a 00:01:59.641 CC lib/event/app.o 00:01:59.641 CC lib/event/reactor.o 00:01:59.641 CC lib/event/log_rpc.o 00:01:59.641 CC lib/event/scheduler_static.o 00:01:59.641 CC lib/event/app_rpc.o 00:01:59.900 LIB libspdk_event.a 00:01:59.900 LIB libspdk_accel.a 00:01:59.900 LIB libspdk_nvme.a 00:02:00.157 CC lib/bdev/bdev.o 00:02:00.157 CC lib/bdev/bdev_rpc.o 00:02:00.157 CC lib/bdev/part.o 00:02:00.157 CC lib/bdev/bdev_zone.o 00:02:00.157 CC lib/bdev/scsi_nvme.o 00:02:01.092 LIB libspdk_blob.a 00:02:01.092 CC lib/lvol/lvol.o 00:02:01.092 CC lib/blobfs/blobfs.o 00:02:01.092 CC lib/blobfs/tree.o 00:02:01.657 LIB libspdk_lvol.a 00:02:01.657 LIB libspdk_blobfs.a 00:02:01.915 LIB libspdk_bdev.a 00:02:02.174 CC lib/nbd/nbd.o 00:02:02.174 CC lib/nbd/nbd_rpc.o 00:02:02.174 CC lib/ublk/ublk.o 00:02:02.174 CC lib/ublk/ublk_rpc.o 00:02:02.174 CC lib/ftl/ftl_core.o 00:02:02.174 CC lib/ftl/ftl_init.o 00:02:02.174 CC lib/ftl/ftl_io.o 00:02:02.174 CC lib/ftl/ftl_layout.o 00:02:02.174 CC lib/ftl/ftl_debug.o 00:02:02.174 CC lib/ftl/ftl_sb.o 00:02:02.174 CC lib/nvmf/ctrlr.o 00:02:02.174 CC lib/nvmf/ctrlr_discovery.o 00:02:02.174 CC lib/ftl/ftl_l2p.o 00:02:02.174 CC lib/ftl/ftl_l2p_flat.o 00:02:02.174 CC lib/nvmf/ctrlr_bdev.o 00:02:02.174 CC lib/ftl/ftl_nv_cache.o 00:02:02.174 CC lib/ftl/ftl_band.o 00:02:02.174 CC lib/nvmf/nvmf.o 00:02:02.174 CC lib/nvmf/subsystem.o 00:02:02.175 CC lib/ftl/ftl_band_ops.o 00:02:02.175 CC lib/ftl/ftl_writer.o 00:02:02.175 CC lib/nvmf/nvmf_rpc.o 00:02:02.175 CC lib/ftl/ftl_rq.o 00:02:02.175 CC lib/nvmf/transport.o 00:02:02.175 CC lib/ftl/ftl_reloc.o 00:02:02.175 CC lib/ftl/ftl_p2l.o 00:02:02.175 CC lib/ftl/ftl_l2p_cache.o 00:02:02.175 CC lib/nvmf/tcp.o 00:02:02.175 CC lib/nvmf/vfio_user.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:02.175 CC lib/nvmf/rdma.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:02.175 CC lib/scsi/dev.o 00:02:02.175 CC lib/scsi/port.o 00:02:02.175 CC lib/scsi/lun.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:02.175 CC lib/scsi/scsi.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:02.175 CC lib/scsi/scsi_bdev.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:02.175 CC lib/scsi/scsi_pr.o 00:02:02.175 CC lib/scsi/scsi_rpc.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:02.175 CC lib/scsi/task.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:02.175 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:02.434 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:02.434 CC lib/ftl/utils/ftl_md.o 00:02:02.434 CC lib/ftl/utils/ftl_conf.o 00:02:02.434 CC lib/ftl/utils/ftl_mempool.o 00:02:02.434 CC lib/ftl/utils/ftl_bitmap.o 00:02:02.434 CC lib/ftl/utils/ftl_property.o 00:02:02.434 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:02.434 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:02.434 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:02.434 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:02.434 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:02.434 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:02.434 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:02.434 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:02.434 CC 
lib/ftl/nvc/ftl_nvc_dev.o 00:02:02.434 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:02.434 CC lib/ftl/base/ftl_base_dev.o 00:02:02.434 CC lib/ftl/base/ftl_base_bdev.o 00:02:02.434 CC lib/ftl/ftl_trace.o 00:02:02.692 LIB libspdk_nbd.a 00:02:02.950 LIB libspdk_ublk.a 00:02:02.950 LIB libspdk_scsi.a 00:02:02.950 LIB libspdk_ftl.a 00:02:03.207 CC lib/vhost/vhost.o 00:02:03.207 CC lib/vhost/rte_vhost_user.o 00:02:03.207 CC lib/vhost/vhost_rpc.o 00:02:03.207 CC lib/vhost/vhost_scsi.o 00:02:03.207 CC lib/vhost/vhost_blk.o 00:02:03.207 CC lib/iscsi/conn.o 00:02:03.207 CC lib/iscsi/iscsi.o 00:02:03.207 CC lib/iscsi/init_grp.o 00:02:03.207 CC lib/iscsi/param.o 00:02:03.207 CC lib/iscsi/portal_grp.o 00:02:03.207 CC lib/iscsi/md5.o 00:02:03.207 CC lib/iscsi/iscsi_rpc.o 00:02:03.207 CC lib/iscsi/tgt_node.o 00:02:03.207 CC lib/iscsi/iscsi_subsystem.o 00:02:03.207 CC lib/iscsi/task.o 00:02:03.772 LIB libspdk_nvmf.a 00:02:03.772 LIB libspdk_vhost.a 00:02:04.030 LIB libspdk_iscsi.a 00:02:04.288 CC module/vfu_device/vfu_virtio.o 00:02:04.288 CC module/vfu_device/vfu_virtio_blk.o 00:02:04.288 CC module/vfu_device/vfu_virtio_scsi.o 00:02:04.288 CC module/vfu_device/vfu_virtio_rpc.o 00:02:04.288 CC module/env_dpdk/env_dpdk_rpc.o 00:02:04.546 CC module/accel/iaa/accel_iaa.o 00:02:04.546 CC module/accel/iaa/accel_iaa_rpc.o 00:02:04.546 CC module/sock/posix/posix.o 00:02:04.546 CC module/scheduler/gscheduler/gscheduler.o 00:02:04.546 CC module/blob/bdev/blob_bdev.o 00:02:04.546 CC module/accel/error/accel_error.o 00:02:04.546 CC module/accel/error/accel_error_rpc.o 00:02:04.546 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:04.546 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:04.546 LIB libspdk_env_dpdk_rpc.a 00:02:04.546 CC module/accel/dsa/accel_dsa.o 00:02:04.546 CC module/accel/dsa/accel_dsa_rpc.o 00:02:04.546 CC module/keyring/file/keyring.o 00:02:04.546 CC module/keyring/file/keyring_rpc.o 00:02:04.546 CC module/accel/ioat/accel_ioat.o 00:02:04.546 CC module/accel/ioat/accel_ioat_rpc.o 00:02:04.546 LIB libspdk_scheduler_gscheduler.a 00:02:04.546 LIB libspdk_accel_error.a 00:02:04.546 LIB libspdk_accel_iaa.a 00:02:04.546 LIB libspdk_scheduler_dpdk_governor.a 00:02:04.546 LIB libspdk_scheduler_dynamic.a 00:02:04.546 LIB libspdk_keyring_file.a 00:02:04.546 LIB libspdk_blob_bdev.a 00:02:04.805 LIB libspdk_accel_dsa.a 00:02:04.805 LIB libspdk_accel_ioat.a 00:02:04.805 LIB libspdk_vfu_device.a 00:02:05.064 LIB libspdk_sock_posix.a 00:02:05.064 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:05.064 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:05.064 CC module/bdev/nvme/bdev_nvme.o 00:02:05.064 CC module/bdev/nvme/nvme_rpc.o 00:02:05.064 CC module/bdev/iscsi/bdev_iscsi.o 00:02:05.064 CC module/bdev/nvme/vbdev_opal.o 00:02:05.064 CC module/bdev/nvme/bdev_mdns_client.o 00:02:05.064 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:05.064 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:05.064 CC module/bdev/error/vbdev_error.o 00:02:05.064 CC module/bdev/passthru/vbdev_passthru.o 00:02:05.064 CC module/bdev/split/vbdev_split.o 00:02:05.064 CC module/bdev/gpt/gpt.o 00:02:05.064 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:05.064 CC module/bdev/error/vbdev_error_rpc.o 00:02:05.064 CC module/bdev/split/vbdev_split_rpc.o 00:02:05.064 CC module/bdev/gpt/vbdev_gpt.o 00:02:05.064 CC module/bdev/delay/vbdev_delay.o 00:02:05.064 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:05.064 CC module/bdev/ftl/bdev_ftl.o 00:02:05.064 CC module/blobfs/bdev/blobfs_bdev.o 00:02:05.064 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:05.064 
CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:05.064 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:05.064 CC module/bdev/lvol/vbdev_lvol.o 00:02:05.064 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:05.064 CC module/bdev/malloc/bdev_malloc.o 00:02:05.064 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:05.064 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:05.064 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:05.064 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:05.064 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:05.064 CC module/bdev/null/bdev_null_rpc.o 00:02:05.064 CC module/bdev/null/bdev_null.o 00:02:05.064 CC module/bdev/raid/bdev_raid.o 00:02:05.064 CC module/bdev/raid/bdev_raid_rpc.o 00:02:05.064 CC module/bdev/raid/raid0.o 00:02:05.064 CC module/bdev/raid/bdev_raid_sb.o 00:02:05.064 CC module/bdev/raid/raid1.o 00:02:05.064 CC module/bdev/raid/concat.o 00:02:05.064 CC module/bdev/aio/bdev_aio.o 00:02:05.064 CC module/bdev/aio/bdev_aio_rpc.o 00:02:05.322 LIB libspdk_bdev_split.a 00:02:05.322 LIB libspdk_bdev_gpt.a 00:02:05.322 LIB libspdk_bdev_null.a 00:02:05.322 LIB libspdk_bdev_passthru.a 00:02:05.322 LIB libspdk_bdev_ftl.a 00:02:05.322 LIB libspdk_blobfs_bdev.a 00:02:05.322 LIB libspdk_bdev_iscsi.a 00:02:05.322 LIB libspdk_bdev_zone_block.a 00:02:05.322 LIB libspdk_bdev_aio.a 00:02:05.322 LIB libspdk_bdev_error.a 00:02:05.322 LIB libspdk_bdev_delay.a 00:02:05.322 LIB libspdk_bdev_lvol.a 00:02:05.322 LIB libspdk_bdev_malloc.a 00:02:05.582 LIB libspdk_bdev_virtio.a 00:02:05.842 LIB libspdk_bdev_raid.a 00:02:06.410 LIB libspdk_bdev_nvme.a 00:02:06.976 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:06.976 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:06.976 CC module/event/subsystems/iobuf/iobuf.o 00:02:06.976 CC module/event/subsystems/vmd/vmd.o 00:02:06.976 CC module/event/subsystems/scheduler/scheduler.o 00:02:06.976 CC module/event/subsystems/keyring/keyring.o 00:02:06.976 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:06.976 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:06.976 CC module/event/subsystems/sock/sock.o 00:02:06.976 LIB libspdk_event_vhost_blk.a 00:02:06.976 LIB libspdk_event_keyring.a 00:02:06.976 LIB libspdk_event_scheduler.a 00:02:06.976 LIB libspdk_event_vmd.a 00:02:06.976 LIB libspdk_event_vfu_tgt.a 00:02:06.976 LIB libspdk_event_sock.a 00:02:06.976 LIB libspdk_event_iobuf.a 00:02:07.544 CC module/event/subsystems/accel/accel.o 00:02:07.544 LIB libspdk_event_accel.a 00:02:07.803 CC module/event/subsystems/bdev/bdev.o 00:02:07.804 LIB libspdk_event_bdev.a 00:02:08.372 CC module/event/subsystems/scsi/scsi.o 00:02:08.372 CC module/event/subsystems/ublk/ublk.o 00:02:08.372 CC module/event/subsystems/nbd/nbd.o 00:02:08.372 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:08.372 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:08.372 LIB libspdk_event_ublk.a 00:02:08.372 LIB libspdk_event_scsi.a 00:02:08.372 LIB libspdk_event_nbd.a 00:02:08.372 LIB libspdk_event_nvmf.a 00:02:08.631 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:08.631 CC module/event/subsystems/iscsi/iscsi.o 00:02:08.631 LIB libspdk_event_vhost_scsi.a 00:02:08.889 LIB libspdk_event_iscsi.a 00:02:09.154 CC app/trace_record/trace_record.o 00:02:09.154 CC app/spdk_top/spdk_top.o 00:02:09.154 CXX app/trace/trace.o 00:02:09.154 CC app/spdk_nvme_perf/perf.o 00:02:09.154 CC app/spdk_lspci/spdk_lspci.o 00:02:09.154 CC app/spdk_nvme_identify/identify.o 00:02:09.154 CC test/rpc_client/rpc_client_test.o 00:02:09.154 CC app/spdk_nvme_discover/discovery_aer.o 00:02:09.154 TEST_HEADER 
include/spdk/accel.h 00:02:09.154 TEST_HEADER include/spdk/accel_module.h 00:02:09.154 TEST_HEADER include/spdk/assert.h 00:02:09.154 TEST_HEADER include/spdk/barrier.h 00:02:09.154 TEST_HEADER include/spdk/base64.h 00:02:09.154 TEST_HEADER include/spdk/bdev.h 00:02:09.154 TEST_HEADER include/spdk/bdev_module.h 00:02:09.154 TEST_HEADER include/spdk/bdev_zone.h 00:02:09.154 TEST_HEADER include/spdk/bit_array.h 00:02:09.154 TEST_HEADER include/spdk/bit_pool.h 00:02:09.154 TEST_HEADER include/spdk/blob_bdev.h 00:02:09.154 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:09.154 TEST_HEADER include/spdk/blobfs.h 00:02:09.154 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:09.154 CC app/nvmf_tgt/nvmf_main.o 00:02:09.154 CC app/spdk_dd/spdk_dd.o 00:02:09.154 TEST_HEADER include/spdk/blob.h 00:02:09.154 CC app/iscsi_tgt/iscsi_tgt.o 00:02:09.154 TEST_HEADER include/spdk/conf.h 00:02:09.154 TEST_HEADER include/spdk/config.h 00:02:09.154 TEST_HEADER include/spdk/cpuset.h 00:02:09.154 CC app/vhost/vhost.o 00:02:09.154 TEST_HEADER include/spdk/crc16.h 00:02:09.154 TEST_HEADER include/spdk/crc32.h 00:02:09.154 TEST_HEADER include/spdk/crc64.h 00:02:09.154 TEST_HEADER include/spdk/dif.h 00:02:09.154 TEST_HEADER include/spdk/dma.h 00:02:09.154 TEST_HEADER include/spdk/endian.h 00:02:09.154 CC examples/ioat/perf/perf.o 00:02:09.154 TEST_HEADER include/spdk/env_dpdk.h 00:02:09.154 CC app/spdk_tgt/spdk_tgt.o 00:02:09.154 TEST_HEADER include/spdk/env.h 00:02:09.154 TEST_HEADER include/spdk/event.h 00:02:09.154 CC examples/vmd/lsvmd/lsvmd.o 00:02:09.154 CC app/fio/nvme/fio_plugin.o 00:02:09.154 CC examples/accel/perf/accel_perf.o 00:02:09.154 TEST_HEADER include/spdk/fd_group.h 00:02:09.154 CC examples/ioat/verify/verify.o 00:02:09.154 TEST_HEADER include/spdk/fd.h 00:02:09.154 TEST_HEADER include/spdk/file.h 00:02:09.154 CC test/nvme/sgl/sgl.o 00:02:09.154 CC test/app/stub/stub.o 00:02:09.154 CC test/thread/lock/spdk_lock.o 00:02:09.154 TEST_HEADER include/spdk/ftl.h 00:02:09.154 CC test/nvme/boot_partition/boot_partition.o 00:02:09.154 CC test/nvme/reserve/reserve.o 00:02:09.154 TEST_HEADER include/spdk/gpt_spec.h 00:02:09.154 CC test/nvme/overhead/overhead.o 00:02:09.154 CC examples/nvme/reconnect/reconnect.o 00:02:09.154 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:09.154 CC test/nvme/reset/reset.o 00:02:09.154 TEST_HEADER include/spdk/hexlify.h 00:02:09.154 CC test/nvme/err_injection/err_injection.o 00:02:09.154 CC test/nvme/cuse/cuse.o 00:02:09.154 CC test/event/reactor/reactor.o 00:02:09.154 CC examples/vmd/led/led.o 00:02:09.154 CC examples/sock/hello_world/hello_sock.o 00:02:09.154 CC test/nvme/e2edp/nvme_dp.o 00:02:09.154 CC test/env/pci/pci_ut.o 00:02:09.154 CC test/thread/poller_perf/poller_perf.o 00:02:09.154 CC test/nvme/connect_stress/connect_stress.o 00:02:09.154 CC test/nvme/fused_ordering/fused_ordering.o 00:02:09.154 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:09.154 CC examples/idxd/perf/perf.o 00:02:09.154 TEST_HEADER include/spdk/histogram_data.h 00:02:09.154 CC test/env/memory/memory_ut.o 00:02:09.154 CC test/event/event_perf/event_perf.o 00:02:09.154 CC test/nvme/startup/startup.o 00:02:09.154 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:09.154 CC test/nvme/fdp/fdp.o 00:02:09.155 CC test/nvme/compliance/nvme_compliance.o 00:02:09.155 CC test/app/histogram_perf/histogram_perf.o 00:02:09.155 CC examples/util/zipf/zipf.o 00:02:09.155 CC test/env/vtophys/vtophys.o 00:02:09.155 TEST_HEADER include/spdk/idxd.h 00:02:09.155 CC examples/nvme/hello_world/hello_world.o 
00:02:09.155 CC test/event/reactor_perf/reactor_perf.o 00:02:09.155 CC test/nvme/aer/aer.o 00:02:09.155 CC test/nvme/simple_copy/simple_copy.o 00:02:09.155 TEST_HEADER include/spdk/idxd_spec.h 00:02:09.155 CC test/app/jsoncat/jsoncat.o 00:02:09.155 TEST_HEADER include/spdk/init.h 00:02:09.155 TEST_HEADER include/spdk/ioat.h 00:02:09.155 TEST_HEADER include/spdk/ioat_spec.h 00:02:09.155 TEST_HEADER include/spdk/iscsi_spec.h 00:02:09.155 CC test/event/app_repeat/app_repeat.o 00:02:09.155 TEST_HEADER include/spdk/json.h 00:02:09.155 LINK spdk_lspci 00:02:09.155 TEST_HEADER include/spdk/jsonrpc.h 00:02:09.155 TEST_HEADER include/spdk/keyring.h 00:02:09.155 CC app/fio/bdev/fio_plugin.o 00:02:09.155 TEST_HEADER include/spdk/keyring_module.h 00:02:09.155 CC test/bdev/bdevio/bdevio.o 00:02:09.155 CC examples/blob/cli/blobcli.o 00:02:09.155 TEST_HEADER include/spdk/likely.h 00:02:09.155 CC examples/nvmf/nvmf/nvmf.o 00:02:09.415 TEST_HEADER include/spdk/log.h 00:02:09.415 TEST_HEADER include/spdk/lvol.h 00:02:09.415 TEST_HEADER include/spdk/memory.h 00:02:09.415 CC examples/blob/hello_world/hello_blob.o 00:02:09.415 TEST_HEADER include/spdk/mmio.h 00:02:09.415 CC examples/bdev/hello_world/hello_bdev.o 00:02:09.415 CC test/accel/dif/dif.o 00:02:09.415 TEST_HEADER include/spdk/nbd.h 00:02:09.415 TEST_HEADER include/spdk/notify.h 00:02:09.415 TEST_HEADER include/spdk/nvme.h 00:02:09.415 CC test/event/scheduler/scheduler.o 00:02:09.415 TEST_HEADER include/spdk/nvme_intel.h 00:02:09.415 CC test/blobfs/mkfs/mkfs.o 00:02:09.415 CC test/app/bdev_svc/bdev_svc.o 00:02:09.415 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:09.415 CC examples/thread/thread/thread_ex.o 00:02:09.415 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:09.415 CC examples/bdev/bdevperf/bdevperf.o 00:02:09.415 TEST_HEADER include/spdk/nvme_spec.h 00:02:09.415 TEST_HEADER include/spdk/nvme_zns.h 00:02:09.415 CC test/dma/test_dma/test_dma.o 00:02:09.415 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:09.415 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:09.415 TEST_HEADER include/spdk/nvmf.h 00:02:09.415 TEST_HEADER include/spdk/nvmf_spec.h 00:02:09.415 TEST_HEADER include/spdk/nvmf_transport.h 00:02:09.415 TEST_HEADER include/spdk/opal.h 00:02:09.415 TEST_HEADER include/spdk/opal_spec.h 00:02:09.415 TEST_HEADER include/spdk/pci_ids.h 00:02:09.415 TEST_HEADER include/spdk/pipe.h 00:02:09.415 TEST_HEADER include/spdk/queue.h 00:02:09.415 TEST_HEADER include/spdk/reduce.h 00:02:09.415 CC test/lvol/esnap/esnap.o 00:02:09.415 TEST_HEADER include/spdk/rpc.h 00:02:09.415 CC test/env/mem_callbacks/mem_callbacks.o 00:02:09.415 LINK rpc_client_test 00:02:09.415 TEST_HEADER include/spdk/scheduler.h 00:02:09.415 TEST_HEADER include/spdk/scsi.h 00:02:09.415 TEST_HEADER include/spdk/scsi_spec.h 00:02:09.415 TEST_HEADER include/spdk/sock.h 00:02:09.415 TEST_HEADER include/spdk/stdinc.h 00:02:09.415 TEST_HEADER include/spdk/string.h 00:02:09.415 TEST_HEADER include/spdk/thread.h 00:02:09.415 TEST_HEADER include/spdk/trace.h 00:02:09.415 TEST_HEADER include/spdk/trace_parser.h 00:02:09.415 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:09.415 TEST_HEADER include/spdk/tree.h 00:02:09.415 LINK spdk_nvme_discover 00:02:09.415 TEST_HEADER include/spdk/ublk.h 00:02:09.415 TEST_HEADER include/spdk/util.h 00:02:09.415 LINK interrupt_tgt 00:02:09.415 TEST_HEADER include/spdk/uuid.h 00:02:09.415 TEST_HEADER include/spdk/version.h 00:02:09.415 LINK lsvmd 00:02:09.415 LINK spdk_trace_record 00:02:09.415 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:09.415 LINK nvmf_tgt 
00:02:09.415 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:09.415 TEST_HEADER include/spdk/vhost.h 00:02:09.415 TEST_HEADER include/spdk/vmd.h 00:02:09.415 TEST_HEADER include/spdk/xor.h 00:02:09.415 TEST_HEADER include/spdk/zipf.h 00:02:09.415 LINK reactor 00:02:09.415 LINK histogram_perf 00:02:09.415 LINK iscsi_tgt 00:02:09.415 CXX test/cpp_headers/accel.o 00:02:09.415 LINK led 00:02:09.415 LINK jsoncat 00:02:09.415 LINK poller_perf 00:02:09.415 LINK reactor_perf 00:02:09.415 LINK zipf 00:02:09.415 LINK vtophys 00:02:09.415 LINK event_perf 00:02:09.415 LINK vhost 00:02:09.415 LINK boot_partition 00:02:09.415 LINK app_repeat 00:02:09.415 LINK fused_ordering 00:02:09.415 LINK stub
00:02:09.415 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end]
00:02:09.415 struct spdk_nvme_fdp_ruhs ruhs;
00:02:09.415 ^
00:02:09.415 LINK ioat_perf 00:02:09.415 LINK env_dpdk_post_init 00:02:09.415 LINK startup 00:02:09.415 LINK reserve 00:02:09.415 LINK err_injection 00:02:09.415 LINK verify 00:02:09.415 LINK doorbell_aers 00:02:09.415 LINK connect_stress 00:02:09.415 LINK hello_world 00:02:09.682 LINK spdk_tgt 00:02:09.682 LINK hello_sock 00:02:09.682 LINK simple_copy 00:02:09.682 LINK bdev_svc 00:02:09.682 LINK sgl 00:02:09.682 LINK mkfs 00:02:09.682 LINK reset 00:02:09.682 LINK fdp 00:02:09.682 LINK nvme_dp 00:02:09.682 LINK overhead 00:02:09.682 LINK aer 00:02:09.682 LINK scheduler 00:02:09.682 LINK hello_bdev 00:02:09.682 LINK thread 00:02:09.682 LINK hello_blob 00:02:09.682 LINK idxd_perf 00:02:09.682 LINK spdk_trace 00:02:09.682 CXX test/cpp_headers/accel_module.o 00:02:09.682 LINK nvmf 00:02:09.682 LINK reconnect 00:02:09.682 LINK spdk_dd 00:02:09.682 LINK bdevio 00:02:09.682 LINK dif 00:02:09.941 LINK test_dma 00:02:09.941 LINK pci_ut 00:02:09.941 LINK nvme_compliance 00:02:09.941 LINK accel_perf 00:02:09.941 LINK nvme_manage 00:02:09.941
1 warning generated.
00:02:09.941 CXX test/cpp_headers/assert.o 00:02:09.941 LINK spdk_nvme 00:02:09.941 LINK nvme_fuzz 00:02:09.941 LINK blobcli 00:02:09.941 LINK spdk_bdev 00:02:09.941 LINK spdk_nvme_identify 00:02:09.941 LINK mem_callbacks 00:02:09.941 CXX test/cpp_headers/barrier.o 00:02:10.200 CC examples/nvme/arbitration/arbitration.o 00:02:10.200 LINK spdk_top 00:02:10.200 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:10.200 LINK spdk_nvme_perf 00:02:10.200 CC examples/nvme/hotplug/hotplug.o 00:02:10.200 CXX test/cpp_headers/base64.o 00:02:10.200 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:10.200 CXX test/cpp_headers/bdev.o 00:02:10.200 LINK bdevperf 00:02:10.200 CXX test/cpp_headers/bdev_module.o 00:02:10.462 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:10.462 LINK memory_ut 00:02:10.462 CXX test/cpp_headers/bdev_zone.o 00:02:10.462 CXX test/cpp_headers/bit_array.o 00:02:10.462 CXX test/cpp_headers/bit_pool.o 00:02:10.462 CC examples/nvme/abort/abort.o 00:02:10.462 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:10.462 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:10.462 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:10.462 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:10.462 CXX test/cpp_headers/blob_bdev.o 00:02:10.462 LINK cuse 00:02:10.462 CXX test/cpp_headers/blobfs_bdev.o 00:02:10.462 LINK hotplug 00:02:10.462 CXX test/cpp_headers/blobfs.o 00:02:10.462 CXX test/cpp_headers/blob.o 00:02:10.462 CXX test/cpp_headers/conf.o 00:02:10.462 CXX test/cpp_headers/config.o 00:02:10.462 LINK cmb_copy 00:02:10.721 CXX test/cpp_headers/cpuset.o 00:02:10.721 CXX test/cpp_headers/crc16.o 00:02:10.721 CXX test/cpp_headers/crc32.o 00:02:10.721 CXX test/cpp_headers/crc64.o 00:02:10.721 CXX test/cpp_headers/dif.o 00:02:10.721 CXX test/cpp_headers/dma.o 00:02:10.721 LINK arbitration 00:02:10.721 LINK pmr_persistence 00:02:10.721 CXX test/cpp_headers/endian.o 00:02:10.721 CXX test/cpp_headers/env_dpdk.o 00:02:10.721 CXX test/cpp_headers/env.o 00:02:10.721 CXX test/cpp_headers/event.o 00:02:10.721 CXX test/cpp_headers/fd_group.o 00:02:10.721 CXX test/cpp_headers/fd.o 00:02:10.721 CXX test/cpp_headers/file.o 00:02:10.721 CXX test/cpp_headers/ftl.o 00:02:10.721 LINK llvm_vfio_fuzz 00:02:10.721 CXX test/cpp_headers/gpt_spec.o 00:02:10.721 CXX test/cpp_headers/hexlify.o 00:02:10.721 CXX test/cpp_headers/histogram_data.o 00:02:10.721 CXX test/cpp_headers/idxd.o 00:02:10.721 CXX test/cpp_headers/idxd_spec.o 00:02:10.721 LINK abort 00:02:10.721 CXX test/cpp_headers/init.o 00:02:10.721 CXX test/cpp_headers/ioat.o 00:02:10.990 CXX test/cpp_headers/ioat_spec.o 00:02:10.990 CXX test/cpp_headers/iscsi_spec.o 00:02:10.990 CXX test/cpp_headers/json.o 00:02:10.990 CXX test/cpp_headers/jsonrpc.o 00:02:10.990 CXX test/cpp_headers/keyring.o 00:02:10.990 CXX test/cpp_headers/keyring_module.o 00:02:10.990 CXX test/cpp_headers/likely.o 00:02:10.990 CXX test/cpp_headers/log.o 00:02:10.990 CXX test/cpp_headers/lvol.o 00:02:10.990 CXX test/cpp_headers/memory.o 00:02:10.990 CXX test/cpp_headers/mmio.o 00:02:10.990 CXX test/cpp_headers/nbd.o 00:02:10.990 CXX test/cpp_headers/notify.o 00:02:10.990 CXX test/cpp_headers/nvme.o 00:02:10.990 CXX test/cpp_headers/nvme_intel.o 00:02:10.990 CXX test/cpp_headers/nvme_ocssd.o 00:02:10.990 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:10.990 CXX test/cpp_headers/nvme_zns.o 00:02:10.990 CXX test/cpp_headers/nvme_spec.o 00:02:10.990 CXX test/cpp_headers/nvmf_cmd.o 00:02:10.990 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:10.990 CXX test/cpp_headers/nvmf.o 00:02:10.990 LINK 
vhost_fuzz 00:02:10.990 CXX test/cpp_headers/nvmf_spec.o 00:02:10.990 CXX test/cpp_headers/nvmf_transport.o 00:02:10.990 CXX test/cpp_headers/opal.o 00:02:10.990 CXX test/cpp_headers/opal_spec.o 00:02:10.990 CXX test/cpp_headers/pci_ids.o 00:02:10.990 CXX test/cpp_headers/pipe.o 00:02:10.990 CXX test/cpp_headers/queue.o 00:02:10.990 CXX test/cpp_headers/reduce.o 00:02:10.990 CXX test/cpp_headers/rpc.o 00:02:10.990 CXX test/cpp_headers/scheduler.o 00:02:10.990 CXX test/cpp_headers/scsi.o 00:02:10.990 CXX test/cpp_headers/scsi_spec.o 00:02:10.990 CXX test/cpp_headers/sock.o 00:02:10.990 CXX test/cpp_headers/stdinc.o 00:02:10.990 CXX test/cpp_headers/string.o 00:02:10.990 CXX test/cpp_headers/thread.o 00:02:10.990 CXX test/cpp_headers/trace.o 00:02:10.990 CXX test/cpp_headers/trace_parser.o 00:02:10.990 CXX test/cpp_headers/tree.o 00:02:10.990 CXX test/cpp_headers/ublk.o 00:02:10.990 CXX test/cpp_headers/util.o 00:02:10.990 CXX test/cpp_headers/uuid.o 00:02:10.990 CXX test/cpp_headers/version.o 00:02:10.990 CXX test/cpp_headers/vfio_user_pci.o 00:02:10.990 CXX test/cpp_headers/vfio_user_spec.o 00:02:11.249 CXX test/cpp_headers/vhost.o 00:02:11.249 CXX test/cpp_headers/vmd.o 00:02:11.249 CXX test/cpp_headers/xor.o 00:02:11.249 CXX test/cpp_headers/zipf.o 00:02:11.249 LINK llvm_nvme_fuzz 00:02:11.249 LINK spdk_lock 00:02:11.816 LINK iscsi_fuzz 00:02:13.262 LINK esnap 00:02:13.829 00:02:13.829 real 0m43.190s 00:02:13.829 user 6m40.502s 00:02:13.829 sys 2m31.767s 00:02:13.829 19:09:00 -- common/autotest_common.sh@1112 -- $ xtrace_disable 00:02:13.829 19:09:00 -- common/autotest_common.sh@10 -- $ set +x 00:02:13.829 ************************************ 00:02:13.829 END TEST make 00:02:13.829 ************************************ 00:02:13.829 19:09:00 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:13.829 19:09:00 -- pm/common@30 -- $ signal_monitor_resources TERM 00:02:13.829 19:09:00 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:02:13.829 19:09:00 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.829 19:09:00 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:13.829 19:09:00 -- pm/common@45 -- $ pid=1495712 00:02:13.829 19:09:00 -- pm/common@52 -- $ sudo kill -TERM 1495712 00:02:13.829 19:09:00 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.829 19:09:00 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:13.829 19:09:00 -- pm/common@45 -- $ pid=1495718 00:02:13.829 19:09:00 -- pm/common@52 -- $ sudo kill -TERM 1495718 00:02:13.829 19:09:00 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.829 19:09:00 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:13.829 19:09:00 -- pm/common@45 -- $ pid=1495716 00:02:13.829 19:09:00 -- pm/common@52 -- $ sudo kill -TERM 1495716 00:02:13.829 19:09:00 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:13.829 19:09:00 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:13.829 19:09:00 -- pm/common@45 -- $ pid=1495719 00:02:13.829 19:09:00 -- pm/common@52 -- $ sudo kill -TERM 1495719 00:02:14.088 19:09:00 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:14.088 19:09:00 -- nvmf/common.sh@7 -- # uname -s 
00:02:14.088 19:09:00 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:14.088 19:09:00 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:14.088 19:09:00 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:14.088 19:09:00 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:14.088 19:09:00 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:14.088 19:09:00 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:14.088 19:09:00 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:14.088 19:09:00 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:14.088 19:09:00 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:14.088 19:09:00 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:14.088 19:09:00 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:02:14.088 19:09:00 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:02:14.089 19:09:00 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:14.089 19:09:00 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:14.089 19:09:00 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:14.089 19:09:00 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:14.089 19:09:00 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:14.089 19:09:01 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:14.089 19:09:01 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:14.089 19:09:01 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:14.089 19:09:01 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.089 19:09:01 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.089 19:09:01 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.089 19:09:01 -- paths/export.sh@5 -- # export PATH 00:02:14.089 19:09:01 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.089 19:09:01 -- nvmf/common.sh@47 -- # : 0 00:02:14.089 19:09:01 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:14.089 19:09:01 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:14.089 19:09:01 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:14.089 19:09:01 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:14.089 19:09:01 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:14.089 19:09:01 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:14.089 19:09:01 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 
00:02:14.089 19:09:01 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:14.089 19:09:01 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:14.089 19:09:01 -- spdk/autotest.sh@32 -- # uname -s 00:02:14.089 19:09:01 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:14.089 19:09:01 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:14.089 19:09:01 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:14.089 19:09:01 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:14.089 19:09:01 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:14.089 19:09:01 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:14.089 19:09:01 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:14.089 19:09:01 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:14.089 19:09:01 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:14.089 19:09:01 -- spdk/autotest.sh@48 -- # udevadm_pid=1552827 00:02:14.089 19:09:01 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:14.089 19:09:01 -- pm/common@17 -- # local monitor 00:02:14.089 19:09:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.089 19:09:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1552829 00:02:14.089 19:09:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.089 19:09:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1552832 00:02:14.089 19:09:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.089 19:09:01 -- pm/common@21 -- # date +%s 00:02:14.089 19:09:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1552834 00:02:14.089 19:09:01 -- pm/common@21 -- # date +%s 00:02:14.089 19:09:01 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.089 19:09:01 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1552837 00:02:14.089 19:09:01 -- pm/common@26 -- # sleep 1 00:02:14.089 19:09:01 -- pm/common@21 -- # date +%s 00:02:14.089 19:09:01 -- pm/common@21 -- # date +%s 00:02:14.089 19:09:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:02:14.089 19:09:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:02:14.089 19:09:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:02:14.089 19:09:01 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:02:14.348 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713978541_collect-bmc-pm.bmc.pm.log 00:02:14.348 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713978541_collect-vmstat.pm.log 00:02:14.348 Redirecting to 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713978541_collect-cpu-load.pm.log 00:02:14.348 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1713978541_collect-cpu-temp.pm.log 00:02:15.282 19:09:02 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:15.282 19:09:02 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:15.282 19:09:02 -- common/autotest_common.sh@710 -- # xtrace_disable 00:02:15.282 19:09:02 -- common/autotest_common.sh@10 -- # set +x 00:02:15.282 19:09:02 -- spdk/autotest.sh@59 -- # create_test_list 00:02:15.282 19:09:02 -- common/autotest_common.sh@734 -- # xtrace_disable 00:02:15.282 19:09:02 -- common/autotest_common.sh@10 -- # set +x 00:02:15.282 19:09:02 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:15.282 19:09:02 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:15.282 19:09:02 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:15.282 19:09:02 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:15.282 19:09:02 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:15.282 19:09:02 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:15.282 19:09:02 -- common/autotest_common.sh@1441 -- # uname 00:02:15.282 19:09:02 -- common/autotest_common.sh@1441 -- # '[' Linux = FreeBSD ']' 00:02:15.282 19:09:02 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:15.282 19:09:02 -- common/autotest_common.sh@1461 -- # uname 00:02:15.282 19:09:02 -- common/autotest_common.sh@1461 -- # [[ Linux = FreeBSD ]] 00:02:15.282 19:09:02 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:15.282 19:09:02 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=clang 00:02:15.282 19:09:02 -- spdk/autotest.sh@72 -- # hash lcov 00:02:15.282 19:09:02 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:15.282 19:09:02 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:15.282 19:09:02 -- common/autotest_common.sh@710 -- # xtrace_disable 00:02:15.282 19:09:02 -- common/autotest_common.sh@10 -- # set +x 00:02:15.282 19:09:02 -- spdk/autotest.sh@91 -- # rm -f 00:02:15.282 19:09:02 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:18.569 0000:1a:00.0 (8086 0a54): Already using the nvme driver 00:02:18.569 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:18.569 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:18.827 0000:80:04.2 (8086 2021): 
Already using the ioatdma driver 00:02:18.827 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:18.827 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:20.731 19:09:07 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:20.731 19:09:07 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:02:20.731 19:09:07 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:02:20.731 19:09:07 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:02:20.731 19:09:07 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:20.731 19:09:07 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:02:20.731 19:09:07 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:02:20.731 19:09:07 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:20.731 19:09:07 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:20.731 19:09:07 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:20.731 19:09:07 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:20.731 19:09:07 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:20.731 19:09:07 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:02:20.731 19:09:07 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:20.731 19:09:07 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:20.731 No valid GPT data, bailing 00:02:20.731 19:09:07 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:20.731 19:09:07 -- scripts/common.sh@391 -- # pt= 00:02:20.731 19:09:07 -- scripts/common.sh@392 -- # return 1 00:02:20.731 19:09:07 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:20.731 1+0 records in 00:02:20.731 1+0 records out 00:02:20.731 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00661669 s, 158 MB/s 00:02:20.731 19:09:07 -- spdk/autotest.sh@118 -- # sync 00:02:20.731 19:09:07 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:20.731 19:09:07 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:20.731 19:09:07 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:25.996 19:09:12 -- spdk/autotest.sh@124 -- # uname -s 00:02:25.996 19:09:12 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:02:25.996 19:09:12 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:25.996 19:09:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:25.996 19:09:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:25.996 19:09:12 -- common/autotest_common.sh@10 -- # set +x 00:02:25.996 ************************************ 00:02:25.996 START TEST setup.sh 00:02:25.996 ************************************ 00:02:25.996 19:09:12 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:25.996 * Looking for test storage... 
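The pre-cleanup traced here treats a namespace as zoned when /sys/block/<dev>/queue/zoned reads anything other than "none", and only zero-fills the first MiB once spdk-gpt.py and blkid both fail to find a partition table. A minimal sketch of that guard under the same sysfs layout, keeping just the blkid probe (a sketch, not the SPDK implementation itself):

    #!/usr/bin/env bash
    # Sketch: skip zoned NVMe namespaces, wipe only disks that carry no
    # partition table. Mirrors the checks in the trace, simplified;
    # needs root, and the dd is destructive by design.

    is_block_zoned() {
        local device=$1
        # Non-zoned devices either lack the attribute or report "none".
        [[ -e /sys/block/$device/queue/zoned ]] || return 1
        [[ $(< "/sys/block/$device/queue/zoned") != none ]]
    }

    for path in /sys/block/nvme*n*; do
        [[ -e $path ]] || continue          # glob may match nothing
        dev=${path##*/}
        if is_block_zoned "$dev"; then
            echo "skipping zoned device /dev/$dev"
            continue
        fi
        # blkid prints the partition-table type (gpt, dos, ...) if any.
        if pt=$(blkid -s PTTYPE -o value "/dev/$dev") && [[ -n $pt ]]; then
            echo "/dev/$dev carries a $pt partition table; left intact"
        else
            # No valid partition data: clear the first MiB, exactly what
            # the dd record in the log above does.
            dd if=/dev/zero of="/dev/$dev" bs=1M count=1
        fi
    done

Run against this rig it would end in the same place as the trace: nvme0n1 has no GPT, so its first MiB is zeroed (the 1048576-byte dd shown above).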
00:02:25.996 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:25.996 19:09:12 -- setup/test-setup.sh@10 -- # uname -s 00:02:25.996 19:09:12 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:25.996 19:09:12 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:25.996 19:09:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:25.996 19:09:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:25.996 19:09:12 -- common/autotest_common.sh@10 -- # set +x 00:02:25.996 ************************************ 00:02:25.996 START TEST acl 00:02:25.996 ************************************ 00:02:25.996 19:09:12 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:25.996 * Looking for test storage... 00:02:25.997 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:25.997 19:09:12 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:25.997 19:09:12 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:02:25.997 19:09:12 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:02:25.997 19:09:12 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:02:25.997 19:09:12 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:25.997 19:09:12 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:02:25.997 19:09:12 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:02:25.997 19:09:12 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:25.997 19:09:12 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:25.997 19:09:12 -- setup/acl.sh@12 -- # devs=() 00:02:25.997 19:09:12 -- setup/acl.sh@12 -- # declare -a devs 00:02:25.997 19:09:12 -- setup/acl.sh@13 -- # drivers=() 00:02:25.997 19:09:12 -- setup/acl.sh@13 -- # declare -A drivers 00:02:25.997 19:09:12 -- setup/acl.sh@51 -- # setup reset 00:02:25.997 19:09:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:25.997 19:09:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:31.265 19:09:18 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:31.265 19:09:18 -- setup/acl.sh@16 -- # local dev driver 00:02:31.265 19:09:18 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:31.265 19:09:18 -- setup/acl.sh@15 -- # setup output status 00:02:31.265 19:09:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:31.265 19:09:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:34.554 Hugepages 00:02:34.554 node hugesize free / total 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 00:02:34.554 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # continue 
00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.554 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.554 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:34.554 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:34.555 19:09:21 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:34.555 19:09:21 -- setup/acl.sh@20 -- # continue 00:02:34.555 19:09:21 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:34.555 19:09:21 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:34.555 19:09:21 -- setup/acl.sh@54 -- # run_test denied denied 00:02:34.555 19:09:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:34.555 19:09:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:34.555 19:09:21 -- common/autotest_common.sh@10 -- # set +x 00:02:34.813 ************************************ 00:02:34.813 START TEST denied 00:02:34.813 ************************************ 00:02:34.813 19:09:21 -- common/autotest_common.sh@1111 -- # denied 00:02:34.813 19:09:21 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0' 00:02:34.813 19:09:21 -- setup/acl.sh@38 -- # setup output config 00:02:34.813 19:09:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:34.814 19:09:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:34.814 19:09:21 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0' 00:02:40.084 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0 00:02:40.084 19:09:26 -- setup/acl.sh@40 -- # verify 0000:1a:00.0 00:02:40.084 19:09:26 -- setup/acl.sh@28 -- # local dev driver 00:02:40.084 19:09:26 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:40.084 19:09:26 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]] 00:02:40.084 19:09:26 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver 00:02:40.084 19:09:26 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:40.084 19:09:26 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:40.084 19:09:26 -- setup/acl.sh@41 -- # setup reset 00:02:40.084 19:09:26 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:40.084 19:09:26 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:46.642 00:02:46.642 real 0m11.160s 00:02:46.642 user 0m3.327s 00:02:46.642 sys 0m6.988s 00:02:46.642 19:09:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:02:46.642 19:09:32 -- common/autotest_common.sh@10 -- # set +x 00:02:46.642 ************************************ 00:02:46.642 END TEST denied 00:02:46.642 ************************************ 
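Both acl subtests hinge on the scan traced above: each row of "setup.sh status" is split as Type, BDF, Vendor, Device, NUMA, Driver; NVMe controllers are collected into devs/drivers; and "denied" then reruns setup with PCI_BLOCKED set and asserts the controller kept its kernel driver. The "allowed" test that follows inverts the check with PCI_ALLOWED. A condensed sketch of that parse-and-verify flow, with two sample rows modeled on this rig's listing (a sketch, not the actual acl.sh code):

    #!/usr/bin/env bash
    # Sketch of the acl scan + denied check. Assumes rows shaped like
    # "Type BDF Vendor Device NUMA Driver ...", as setup.sh status prints.

    declare -a devs
    declare -A drivers

    # Pull NVMe controllers out of a status-style listing.
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue   # skip header/hugepage rows
        [[ $driver == nvme ]] || continue   # ioatdma channels are ignored
        devs+=("$dev")
        drivers["$dev"]=$driver
    done <<'EOF'
    I/OAT 0000:00:04.0 8086 2021 0 ioatdma -
    NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0
    EOF

    # A denied controller must keep its original kernel driver after
    # "PCI_BLOCKED='0000:1a:00.0' setup.sh config" has run.
    for dev in "${devs[@]}"; do
        driver=$(readlink -f "/sys/bus/pci/devices/$dev/driver") || continue
        [[ ${driver##*/} == "${drivers[$dev]}" ]] &&
            echo "$dev still bound to ${drivers[$dev]}"
    done

The readlink check is the same one the trace performs: the driver symlink for 0000:1a:00.0 still resolves to /sys/bus/pci/drivers/nvme, so the blocked controller was left alone.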
00:02:46.642 19:09:32 -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:46.642 19:09:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:46.642 19:09:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:46.642 19:09:32 -- common/autotest_common.sh@10 -- # set +x 00:02:46.642 ************************************ 00:02:46.642 START TEST allowed 00:02:46.642 ************************************ 00:02:46.642 19:09:33 -- common/autotest_common.sh@1111 -- # allowed 00:02:46.642 19:09:33 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0 00:02:46.642 19:09:33 -- setup/acl.sh@45 -- # setup output config 00:02:46.642 19:09:33 -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*' 00:02:46.642 19:09:33 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:46.642 19:09:33 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:54.754 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:02:54.754 19:09:41 -- setup/acl.sh@47 -- # verify 00:02:54.754 19:09:41 -- setup/acl.sh@28 -- # local dev driver 00:02:54.754 19:09:41 -- setup/acl.sh@48 -- # setup reset 00:02:54.754 19:09:41 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:54.754 19:09:41 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:00.023 00:03:00.023 real 0m13.207s 00:03:00.023 user 0m2.941s 00:03:00.023 sys 0m6.797s 00:03:00.023 19:09:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:00.023 19:09:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.023 ************************************ 00:03:00.023 END TEST allowed 00:03:00.023 ************************************ 00:03:00.023 00:03:00.023 real 0m33.909s 00:03:00.023 user 0m9.619s 00:03:00.023 sys 0m20.149s 00:03:00.023 19:09:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:00.023 19:09:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.023 ************************************ 00:03:00.023 END TEST acl 00:03:00.023 ************************************ 00:03:00.023 19:09:46 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:00.023 19:09:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:00.023 19:09:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:00.023 19:09:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.023 ************************************ 00:03:00.023 START TEST hugepages 00:03:00.023 ************************************ 00:03:00.023 19:09:46 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:00.023 * Looking for test storage... 
00:03:00.023 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:00.023 19:09:46 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:03:00.023 19:09:46 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:03:00.023 19:09:46 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:03:00.023 19:09:46 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:03:00.023 19:09:46 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:03:00.023 19:09:46 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:03:00.023 19:09:46 -- setup/common.sh@17 -- # local get=Hugepagesize
00:03:00.023 19:09:46 -- setup/common.sh@18 -- # local node=
00:03:00.023 19:09:46 -- setup/common.sh@19 -- # local var val
00:03:00.023 19:09:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:00.023 19:09:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:00.023 19:09:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:00.023 19:09:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:00.023 19:09:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:00.023 19:09:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:00.023 19:09:46 -- setup/common.sh@31 -- # IFS=': '
00:03:00.023 19:09:46 -- setup/common.sh@31 -- # read -r var val _
00:03:00.023 19:09:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 71758260 kB' 'MemAvailable: 76784636 kB' 'Buffers: 20532 kB' 'Cached: 12921584 kB' 'SwapCached: 0 kB' 'Active: 8740740 kB' 'Inactive: 4745244 kB' 'Active(anon): 8110504 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 547172 kB' 'Mapped: 177764 kB' 'Shmem: 7566636 kB' 'KReclaimable: 487080 kB' 'Slab: 895200 kB' 'SReclaimable: 487080 kB' 'SUnreclaim: 408120 kB' 'KernelStack: 16240 kB' 'PageTables: 9044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438216 kB' 'Committed_AS: 9506812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209500 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[setup/common.sh@32 xtrace elided: the "read -r var val _" / "continue" loop steps past every /proc/meminfo field above until Hugepagesize matches]
00:03:00.024 19:09:46 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:03:00.024 19:09:46 -- setup/common.sh@33 -- # echo 2048
00:03:00.024 19:09:46 -- setup/common.sh@33 -- # return 0
00:03:00.024 19:09:46 -- setup/hugepages.sh@16 -- # default_hugepages=2048
00:03:00.024 19:09:46 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
00:03:00.024 19:09:46 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages
00:03:00.024 19:09:46 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC
00:03:00.024 19:09:46 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM
00:03:00.024 19:09:46 -- setup/hugepages.sh@23 -- # unset -v HUGENODE
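The lookup condensed above is a plain field scan: get_meminfo reads the meminfo file under IFS=': ', skips every key that is not the requested one, and echoes the matching value, 2048 for Hugepagesize on this rig. A standalone rendering under the same assumptions (colon-separated meminfo records; a hypothetical sketch, not the exact helper in setup/common.sh):

    #!/usr/bin/env bash
    # Sketch of the get_meminfo scan: walk /proc/meminfo record by
    # record and print the value of one field (in kB for sized fields).

    get_meminfo() {
        # The traced helper can also read a per-node file under
        # /sys/devices/system/node/nodeN/meminfo after stripping the
        # "Node N " prefix; this sketch covers the global case only.
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < /proc/meminfo
        return 1
    }

    get_meminfo Hugepagesize   # prints 2048 on this rig

The skipped-field iterations in the trace are exactly this loop's "continue" branch, one xtrace record per meminfo line until the key matches.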
00:03:00.024 19:09:46 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:00.024 19:09:46 -- setup/hugepages.sh@207 -- # get_nodes 00:03:00.024 19:09:46 -- setup/hugepages.sh@27 -- # local node 00:03:00.024 19:09:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:00.024 19:09:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:00.024 19:09:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:00.024 19:09:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:00.024 19:09:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:00.024 19:09:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:00.024 19:09:46 -- setup/hugepages.sh@208 -- # clear_hp 00:03:00.024 19:09:46 -- setup/hugepages.sh@37 -- # local node hp 00:03:00.024 19:09:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:00.025 19:09:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.025 19:09:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.025 19:09:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:00.025 19:09:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.025 19:09:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.025 19:09:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:00.025 19:09:46 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:00.025 19:09:46 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:00.025 19:09:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:00.025 19:09:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:00.025 19:09:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.025 ************************************ 00:03:00.025 START TEST default_setup 00:03:00.025 ************************************ 00:03:00.025 19:09:46 -- common/autotest_common.sh@1111 -- # default_setup 00:03:00.025 19:09:46 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:00.025 19:09:46 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:00.025 19:09:46 -- setup/hugepages.sh@51 -- # shift 00:03:00.025 19:09:46 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:00.025 19:09:46 -- setup/hugepages.sh@52 -- # local node_ids 00:03:00.025 19:09:46 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:00.025 19:09:46 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:00.025 19:09:46 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:00.025 19:09:46 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:00.025 19:09:46 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:00.025 19:09:46 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:00.025 19:09:46 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:00.025 19:09:46 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:00.025 19:09:46 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:00.025 19:09:46 -- 
setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:00.025 19:09:46 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:00.025 19:09:46 -- setup/hugepages.sh@73 -- # return 0 00:03:00.025 19:09:46 -- setup/hugepages.sh@137 -- # setup output 00:03:00.025 19:09:46 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:00.025 19:09:46 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:03.309 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:03.310 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:06.604 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:03:08.054 19:09:54 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:08.054 19:09:54 -- setup/hugepages.sh@89 -- # local node 00:03:08.054 19:09:54 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:08.054 19:09:54 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:08.054 19:09:54 -- setup/hugepages.sh@92 -- # local surp 00:03:08.054 19:09:54 -- setup/hugepages.sh@93 -- # local resv 00:03:08.054 19:09:54 -- setup/hugepages.sh@94 -- # local anon 00:03:08.054 19:09:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:08.054 19:09:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:08.054 19:09:54 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:08.054 19:09:54 -- setup/common.sh@18 -- # local node= 00:03:08.054 19:09:54 -- setup/common.sh@19 -- # local var val 00:03:08.054 19:09:54 -- setup/common.sh@20 -- # local mem_f mem 00:03:08.054 19:09:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:08.054 19:09:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:08.054 19:09:54 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:08.054 19:09:54 -- setup/common.sh@28 -- # mapfile -t mem 00:03:08.054 19:09:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:08.054 19:09:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73944892 kB' 'MemAvailable: 78971188 kB' 'Buffers: 20532 kB' 'Cached: 12921736 kB' 'SwapCached: 0 kB' 'Active: 8756228 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125992 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562584 kB' 'Mapped: 177892 kB' 'Shmem: 7566788 kB' 'KReclaimable: 487000 kB' 'Slab: 894520 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 407520 kB' 'KernelStack: 16144 kB' 'PageTables: 8944 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9523700 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209580 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 00:03:08.054 19:09:54 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:54 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- 
# continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.054 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.054 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 
19:09:55 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:08.055 19:09:55 -- setup/common.sh@33 -- # echo 0 00:03:08.055 19:09:55 -- setup/common.sh@33 -- # return 0 00:03:08.055 19:09:55 -- setup/hugepages.sh@97 -- # anon=0 00:03:08.055 19:09:55 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:08.055 19:09:55 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:08.055 19:09:55 -- setup/common.sh@18 -- # local node= 00:03:08.055 19:09:55 -- setup/common.sh@19 -- # local var val 00:03:08.055 19:09:55 -- setup/common.sh@20 -- # local mem_f mem 00:03:08.055 19:09:55 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:08.055 19:09:55 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:08.055 19:09:55 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:08.055 19:09:55 -- setup/common.sh@28 -- # mapfile -t mem 00:03:08.055 19:09:55 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.055 19:09:55 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.055 19:09:55 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73945508 kB' 'MemAvailable: 78971804 kB' 'Buffers: 20532 kB' 'Cached: 12921740 kB' 'SwapCached: 0 kB' 'Active: 8756132 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125896 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562416 kB' 'Mapped: 177748 kB' 'Shmem: 7566792 kB' 'KReclaimable: 487000 kB' 'Slab: 894544 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 407544 kB' 'KernelStack: 16112 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9523712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209580 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 
00:03:08.055 19:09:55 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.055 19:09:55 -- setup/common.sh@32 -- # continue
[xtrace condensed: get_meminfo compares every remaining /proc/meminfo field against HugePages_Surp, hitting continue on each non-match from MemFree through HugePages_Rsvd]
00:03:08.057 19:09:55 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.057 19:09:55 -- setup/common.sh@33 -- # echo 0
00:03:08.057 19:09:55 -- setup/common.sh@33 -- # return 0
00:03:08.057 19:09:55 -- setup/hugepages.sh@99 -- # surp=0
00:03:08.057 19:09:55 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:08.057 19:09:55 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:08.057 19:09:55 -- setup/common.sh@18 -- # local node=
00:03:08.057 19:09:55 -- setup/common.sh@19 -- # local var val
00:03:08.057 19:09:55 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.057 19:09:55 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.057 19:09:55 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.057 19:09:55 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.057 19:09:55 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.057 19:09:55 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.057 19:09:55 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73945872 kB' 'MemAvailable: 78972168 kB' 'Buffers: 20532 kB' 'Cached: 12921752 kB' 'SwapCached: 0 kB' 'Active: 8756268 kB' 'Inactive: 4745244 kB' 'Active(anon): 8126032 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562556 kB' 'Mapped: 177748 kB' 'Shmem: 7566804 kB' 'KReclaimable: 487000 kB' 'Slab: 894544 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 407544 kB' 'KernelStack: 16112 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9523728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209564 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:08.057 19:09:55 -- setup/common.sh@31 -- # IFS=': '
00:03:08.057 19:09:55 -- setup/common.sh@31 -- # read -r var val _
00:03:08.057 19:09:55 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:08.057 19:09:55 -- setup/common.sh@32 -- # continue
[xtrace condensed: the same field-by-field scan repeats for HugePages_Rsvd, continuing past every non-matching /proc/meminfo line]
00:03:08.318 19:09:55 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:08.318 19:09:55 -- setup/common.sh@33 -- # echo 0
00:03:08.318 19:09:55 -- setup/common.sh@33 -- # return 0
00:03:08.318 19:09:55 -- setup/hugepages.sh@100 -- # resv=0
00:03:08.318 19:09:55 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:08.318 nr_hugepages=1024
00:03:08.318 19:09:55 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:08.318 resv_hugepages=0
00:03:08.318 19:09:55 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:08.318 surplus_hugepages=0
00:03:08.318 19:09:55 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:08.318 anon_hugepages=0
00:03:08.318 19:09:55 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:08.318 19:09:55 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
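[editor's note] The get_meminfo calls in this stretch feed the consistency check default_setup is really about: every configured hugepage must be present and unencumbered, i.e. HugePages_Total == nr_hugepages + surplus + reserved, with THP contributing nothing. Restated with this run's values (variable names mirror the trace; a worked check, not SPDK code):

  nr_hugepages=1024   # the count the test configured
  surp=0              # HugePages_Surp, read above
  resv=0              # HugePages_Rsvd, read above
  anon=0              # AnonHugePages (THP), read earlier
  (( 1024 == nr_hugepages + surp + resv )) && echo 'pool fully accounted for'
  # 1024 == 1024 + 0 + 0 holds, so both assertions above pass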
00:03:08.318 19:09:55 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:08.318 19:09:55 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:08.318 19:09:55 -- setup/common.sh@18 -- # local node=
00:03:08.318 19:09:55 -- setup/common.sh@19 -- # local var val
00:03:08.318 19:09:55 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.318 19:09:55 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.318 19:09:55 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.318 19:09:55 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.318 19:09:55 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.318 19:09:55 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.318 19:09:55 -- setup/common.sh@31 -- # IFS=': '
00:03:08.318 19:09:55 -- setup/common.sh@31 -- # read -r var val _
00:03:08.318 19:09:55 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73945648 kB' 'MemAvailable: 78971944 kB' 'Buffers: 20532 kB' 'Cached: 12921764 kB' 'SwapCached: 0 kB' 'Active: 8756248 kB' 'Inactive: 4745244 kB' 'Active(anon): 8126012 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562508 kB' 'Mapped: 177748 kB' 'Shmem: 7566816 kB' 'KReclaimable: 487000 kB' 'Slab: 894544 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 407544 kB' 'KernelStack: 16096 kB' 'PageTables: 8816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9523740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209564 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[xtrace condensed: the scan for HugePages_Total continues past every non-matching field]
00:03:08.319 19:09:55 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:08.319 19:09:55 -- setup/common.sh@33 -- # echo 1024
00:03:08.319 19:09:55 -- setup/common.sh@33 -- # return 0
00:03:08.319 19:09:55 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:08.319 19:09:55 -- setup/hugepages.sh@112 -- # get_nodes
00:03:08.319 19:09:55 -- setup/hugepages.sh@27 -- # local node
00:03:08.319 19:09:55 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:08.319 19:09:55 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:08.319 19:09:55 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:08.319 19:09:55 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:08.319 19:09:55 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:08.319 19:09:55 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:08.319 19:09:55 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:08.319 19:09:55 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:08.319 19:09:55 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:08.319 19:09:55 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:08.319 19:09:55 -- setup/common.sh@18 -- # local node=0
00:03:08.319 19:09:55 -- setup/common.sh@19 -- # local var val
00:03:08.319 19:09:55 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.319 19:09:55 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.319 19:09:55 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:08.319 19:09:55 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:08.319 19:09:55 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.319 19:09:55 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.320 19:09:55 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 39738676 kB' 'MemUsed: 8378288 kB' 'SwapCached: 0 kB' 'Active: 3640304 kB' 'Inactive: 607904 kB' 'Active(anon): 3260668 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842608 kB' 'Mapped: 116700 kB' 'AnonPages: 408856 kB' 'Shmem: 2855068 kB' 'KernelStack: 10072 kB' 'PageTables: 5800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329924 kB' 'Slab: 569536 kB' 'SReclaimable: 329924 kB' 'SUnreclaim: 239612 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
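[editor's note] The call just traced, get_meminfo HugePages_Surp 0, switches from the global /proc/meminfo to the per-node file, whose lines are prefixed with "Node 0 " (stripped by the mem=(...) expansion). An illustrative comparison of the two views: the output shapes are standard Linux sysfs, the numbers are the ones this run reports, and node1 holding 0 pages is inferred from the nodes_sys values get_nodes recorded above:

  grep HugePages_Total /proc/meminfo
  #   HugePages_Total:    1024
  grep HugePages_Total /sys/devices/system/node/node0/meminfo
  #   Node 0 HugePages_Total:  1024
  grep HugePages_Total /sys/devices/system/node/node1/meminfo
  #   Node 1 HugePages_Total:     0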
00:03:08.320 19:09:55 -- setup/common.sh@31 -- # IFS=': '
00:03:08.320 19:09:55 -- setup/common.sh@31 -- # read -r var val _
00:03:08.320 19:09:55 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.320 19:09:55 -- setup/common.sh@32 -- # continue
[xtrace condensed: the scan repeats over node0's meminfo fields, continuing past every line that is not HugePages_Surp]
00:03:08.320 19:09:55 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.320 19:09:55 -- setup/common.sh@33 -- # echo 0
00:03:08.320 19:09:55 -- setup/common.sh@33 -- # return 0
00:03:08.320 19:09:55 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:08.320 19:09:55 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:08.320 19:09:55 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:08.320 19:09:55 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:08.320 19:09:55 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:08.320 node0=1024 expecting 1024
00:03:08.320 19:09:55 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:08.320
00:03:08.320 real 0m8.310s
00:03:08.320 user 0m1.677s
00:03:08.320 sys 0m3.476s
00:03:08.320 19:09:55 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:08.320 19:09:55 -- common/autotest_common.sh@10 -- # set +x
00:03:08.321 ************************************
00:03:08.321 END TEST default_setup
00:03:08.321 ************************************
00:03:08.321 19:09:55 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:08.321 19:09:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:08.321 19:09:55 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:08.321 19:09:55 -- common/autotest_common.sh@10 -- # set +x
00:03:08.321 ************************************
00:03:08.321 START TEST per_node_1G_alloc
00:03:08.321 ************************************
00:03:08.321 19:09:55 -- common/autotest_common.sh@1111 -- # per_node_1G_alloc
00:03:08.321 19:09:55 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:08.321 19:09:55 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:08.321 19:09:55 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:08.321 19:09:55 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:08.321 19:09:55 -- setup/hugepages.sh@51 -- # shift
00:03:08.321 19:09:55 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:08.321 19:09:55 -- setup/hugepages.sh@52 -- # local node_ids
00:03:08.321 19:09:55 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:08.321 19:09:55 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:08.321 19:09:55 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:08.321 19:09:55 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:08.321 19:09:55 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:08.321 19:09:55 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:08.321 19:09:55 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:08.321 19:09:55 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:08.321 19:09:55 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:08.321 19:09:55 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:08.321 19:09:55 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:08.321 19:09:55 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:08.321 19:09:55 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:08.321 19:09:55 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:08.321 19:09:55 -- setup/hugepages.sh@73 -- # return 0
00:03:08.321 19:09:55 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:08.321 19:09:55 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
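[editor's note] The sizing just traced, spelled out: per_node_1G_alloc asks get_test_nr_hugepages for 1048576 kB (1 GiB) on each of nodes 0 and 1, and with the 2048 kB Hugepagesize visible in every snapshot that works out to 512 pages per node; hence nr_hugepages=512, nodes_test[0]=nodes_test[1]=512, and the NRHUGE/HUGENODE environment handed to setup.sh next. A sketch of the arithmetic (variable names are illustrative, not SPDK's):

  size_kb=1048576                               # 1 GiB requested per node
  hugepagesize_kb=2048                          # 'Hugepagesize: 2048 kB' in the snapshots
  nodes=(0 1)
  nr_per_node=$(( size_kb / hugepagesize_kb ))  # 1048576 / 2048 = 512
  printf 'NRHUGE=%s HUGENODE=%s\n' "$nr_per_node" "$(IFS=,; echo "${nodes[*]}")"
  # -> NRHUGE=512 HUGENODE=0,1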
00:03:08.321 19:09:55 -- setup/hugepages.sh@146 -- # setup output
00:03:08.321 19:09:55 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:08.321 19:09:55 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:10.850 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:10.850 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:10.850 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:10.850 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:11.109 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:13.017 19:09:59 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:13.017 19:09:59 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:13.017 19:09:59 -- setup/hugepages.sh@89 -- # local node
00:03:13.017 19:09:59 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:13.017 19:09:59 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:13.017 19:09:59 -- setup/hugepages.sh@92 -- # local surp
00:03:13.017 19:09:59 -- setup/hugepages.sh@93 -- # local resv
00:03:13.017 19:09:59 -- setup/hugepages.sh@94 -- # local anon
00:03:13.017 19:09:59 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:13.017 19:09:59 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:13.017 19:09:59 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:13.017 19:09:59 -- setup/common.sh@18 -- # local node=
00:03:13.017 19:09:59 -- setup/common.sh@19 -- # local var val
00:03:13.017 19:09:59 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.017 19:09:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.017 19:09:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.017 19:09:59 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.017 19:09:59 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.017 19:09:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.017 19:09:59 -- setup/common.sh@31 -- # IFS=': '
00:03:13.017 19:09:59 -- setup/common.sh@31 -- # read -r var val _
00:03:13.017 19:09:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73957356 kB' 'MemAvailable: 78983652 kB' 'Buffers: 20532 kB' 'Cached: 12921880 kB' 'SwapCached: 0 kB' 'Active: 8755652 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125416 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561864 kB' 'Mapped: 176968 kB' 'Shmem: 7566932 kB' 'KReclaimable: 487000 kB' 'Slab: 893504 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406504 kB' 'KernelStack: 16256 kB' 'PageTables: 9296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513184 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209580 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:13.017 [... scan: fields MemTotal through HardwareCorrupted compared against AnonHugePages, each skipped with "continue" ...]
00:03:13.018 19:09:59 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:13.018 19:09:59 -- setup/common.sh@33 -- # echo 0
00:03:13.018 19:09:59 -- setup/common.sh@33 -- # return 0
00:03:13.018 19:09:59 -- setup/hugepages.sh@97 -- # anon=0
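Every get_meminfo call in this trace follows the same pattern: mapfile the meminfo file into an array, strip any leading "Node N " prefix, then read entry by entry with IFS=': ' until the requested key matches, at which point its value is echoed and the function returns. A condensed bash sketch of that lookup, assuming plain /proc/meminfo input; the function name is illustrative, not the setup/common.sh source:

    # Core of the key/value scan traced above, reduced to one loop.
    # "AnonHugePages:       0 kB" splits into var=AnonHugePages, val=0.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < /proc/meminfo
        return 1
    }
    anon=$(get_meminfo_sketch AnonHugePages)   # 0 in the run above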
00:03:13.018 19:09:59 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:13.018 19:09:59 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.018 19:09:59 -- setup/common.sh@18 -- # local node=
00:03:13.018 19:09:59 -- setup/common.sh@19 -- # local var val
00:03:13.018 19:09:59 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.018 19:09:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.018 19:09:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.018 19:09:59 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.018 19:09:59 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.018 19:09:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.018 19:09:59 -- setup/common.sh@31 -- # IFS=': '
00:03:13.018 19:09:59 -- setup/common.sh@31 -- # read -r var val _
00:03:13.018 19:09:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73957936 kB' 'MemAvailable: 78984232 kB' 'Buffers: 20532 kB' 'Cached: 12921884 kB' 'SwapCached: 0 kB' 'Active: 8755480 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125244 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561644 kB' 'Mapped: 176948 kB' 'Shmem: 7566936 kB' 'KReclaimable: 487000 kB' 'Slab: 893492 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406492 kB' 'KernelStack: 16192 kB' 'PageTables: 9332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209548 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:13.018 [... scan: fields MemTotal through HugePages_Rsvd compared against HugePages_Surp, each skipped with "continue" ...]
00:03:13.020 19:09:59 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.020 19:09:59 -- setup/common.sh@33 -- # echo 0
00:03:13.020 19:09:59 -- setup/common.sh@33 -- # return 0
00:03:13.020 19:09:59 -- setup/hugepages.sh@99 -- # surp=0
00:03:13.020 19:09:59 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:13.020 19:09:59 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:13.020 19:09:59 -- setup/common.sh@18 -- # local node=
00:03:13.020 19:09:59 -- setup/common.sh@19 -- # local var val
00:03:13.020 19:09:59 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.020 19:09:59 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.020 19:09:59 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.020 19:09:59 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.020 19:09:59 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.020 19:09:59 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.020 19:09:59 -- setup/common.sh@31 -- # IFS=': '
00:03:13.020 19:09:59 -- setup/common.sh@31 -- # read -r var val _
00:03:13.020 19:09:59 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73963268 kB' 'MemAvailable: 78989564 kB' 'Buffers: 20532 kB' 'Cached: 12921896 kB' 'SwapCached: 0 kB' 'Active: 8755544 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125308 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561652 kB' 'Mapped: 176948 kB' 'Shmem: 7566948 kB' 'KReclaimable: 487000 kB' 'Slab: 893492 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406492 kB' 'KernelStack: 16080 kB' 'PageTables: 8732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9511812 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209436 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:13.020 [... scan: fields MemTotal through HugePages_Free compared against HugePages_Rsvd, each skipped with "continue" ...]
00:03:13.021 19:10:00 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:13.021 19:10:00 -- setup/common.sh@33 -- # echo 0
00:03:13.021 19:10:00 -- setup/common.sh@33 -- # return 0
00:03:13.021 19:10:00 -- setup/hugepages.sh@100 -- # resv=0
00:03:13.021 19:10:00 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:13.021 nr_hugepages=1024
00:03:13.021 19:10:00 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:13.021 resv_hugepages=0
00:03:13.021 19:10:00 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:13.021 surplus_hugepages=0
00:03:13.021 19:10:00 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:13.021 anon_hugepages=0
00:03:13.021 19:10:00 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:13.021 19:10:00 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
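The three probes so far (anon=0, surp=0, resv=0) feed one consistency check: the kernel's HugePages_Total must equal the requested page count plus any surplus and reserved pages. A sketch of that arithmetic under the values from this run; the awk lookup merely stands in for get_meminfo and is not part of the traced script:

    # Hugepage accounting check as traced above: with no surplus and no
    # reserved pages, HugePages_Total must equal the requested 1024.
    nr_hugepages=1024 surp=0 resv=0
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    if ((total == nr_hugepages + surp + resv)); then
        echo "nr_hugepages=$nr_hugepages verified"
    else
        echo "hugepage accounting mismatch: total=$total" >&2
    fi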
setup/common.sh@17 -- # local get=HugePages_Total 00:03:13.021 19:10:00 -- setup/common.sh@18 -- # local node= 00:03:13.021 19:10:00 -- setup/common.sh@19 -- # local var val 00:03:13.021 19:10:00 -- setup/common.sh@20 -- # local mem_f mem 00:03:13.021 19:10:00 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.021 19:10:00 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.021 19:10:00 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.021 19:10:00 -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.021 19:10:00 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.021 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.021 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73964540 kB' 'MemAvailable: 78990836 kB' 'Buffers: 20532 kB' 'Cached: 12921908 kB' 'SwapCached: 0 kB' 'Active: 8756156 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125920 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562324 kB' 'Mapped: 176948 kB' 'Shmem: 7566960 kB' 'KReclaimable: 487000 kB' 'Slab: 893492 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406492 kB' 'KernelStack: 16288 kB' 'PageTables: 9208 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209532 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # continue 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # continue 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # continue 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # continue 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # continue 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.022 19:10:00 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.022 19:10:00 -- setup/common.sh@32 -- # [[ SwapCached == 
[setup/common.sh@31-32 xtrace, 00:03:13.022-00:03:13.284: each remaining /proc/meminfo key from Active through Unaccepted is compared against HugePages_Total and skipped with continue]
00:03:13.284 19:10:00 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:13.284 19:10:00 -- setup/common.sh@33 -- # echo 1024
00:03:13.284 19:10:00 -- setup/common.sh@33 -- # return 0
00:03:13.284 19:10:00 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:13.284 19:10:00 -- setup/hugepages.sh@112 -- # get_nodes
00:03:13.284 19:10:00 -- setup/hugepages.sh@27 -- # local node
00:03:13.284 19:10:00 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.284 19:10:00 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:13.284 19:10:00 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.284 19:10:00 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:13.284 19:10:00 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:13.284 19:10:00 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:13.284 19:10:00 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:13.284 19:10:00 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:13.284 19:10:00 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:13.284 19:10:00 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.284 19:10:00 -- setup/common.sh@18 -- # local node=0
00:03:13.284 19:10:00 -- setup/common.sh@19 -- # local var val
00:03:13.284 19:10:00 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.284 19:10:00 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.284 19:10:00 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:13.284 19:10:00 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:13.284 19:10:00 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.284 19:10:00 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.284 19:10:00 -- setup/common.sh@31 -- # IFS=': '
00:03:13.284 19:10:00 -- setup/common.sh@31 -- # read -r var val _
00:03:13.285 19:10:00 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40790688 kB' 'MemUsed: 7326276 kB' 'SwapCached: 0 kB' 'Active: 3641648 kB' 'Inactive: 607904 kB' 'Active(anon): 3262012 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842704 kB' 'Mapped: 115900 kB' 'AnonPages: 410256 kB' 'Shmem: 2855164 kB' 'KernelStack: 10024 kB' 'PageTables: 5576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329924 kB' 'Slab: 568708 kB' 'SReclaimable: 329924 kB' 'SUnreclaim: 238784 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
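[annotation] The runs of continue lines above are setup/common.sh's get_meminfo walking /proc/meminfo (or a per-node meminfo file) one key at a time until the requested key matches. A minimal stand-alone sketch of the same scan, written for illustration only and not the SPDK helper itself:

    #!/usr/bin/env bash
    # Sketch: print the value of one meminfo key, mirroring the
    # IFS=': ' / read -r var val _ loop traced above.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the repeated 'continue' lines
            echo "$val"                        # e.g. 'echo 1024' in the trace
            return 0
        done < /proc/meminfo
        return 1
    }
    get_meminfo_sketch HugePages_Total   # prints 1024 on this runner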
[setup/common.sh@31-32 xtrace, 00:03:13.285: each node0 meminfo key from MemTotal through HugePages_Free is compared against HugePages_Surp and skipped with continue]
00:03:13.285 19:10:00 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.285 19:10:00 -- setup/common.sh@33 -- # echo 0
00:03:13.285 19:10:00 -- setup/common.sh@33 -- # return 0
00:03:13.285 19:10:00 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:13.285 19:10:00 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:13.286 19:10:00 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:13.286 19:10:00 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:13.286 19:10:00 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.286 19:10:00 -- setup/common.sh@18 -- # local node=1
00:03:13.286 19:10:00 -- setup/common.sh@19 -- # local var val
00:03:13.286 19:10:00 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.286 19:10:00 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.286 19:10:00 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:13.286 19:10:00 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:13.286 19:10:00 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.286 19:10:00 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.286 19:10:00 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176564 kB' 'MemFree: 33176744 kB' 'MemUsed: 10999820 kB' 'SwapCached: 0 kB' 'Active: 5117684 kB' 'Inactive: 4137340 kB' 'Active(anon): 4867084 kB' 'Inactive(anon): 0 kB' 'Active(file): 250600 kB' 'Inactive(file): 4137340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9099764 kB' 'Mapped: 61048 kB' 'AnonPages: 155572 kB' 'Shmem: 4711824 kB' 'KernelStack: 6072 kB' 'PageTables: 2912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157076 kB' 'Slab: 324752 kB' 'SReclaimable: 157076 kB' 'SUnreclaim: 167676 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:13.286 19:10:00 -- setup/common.sh@31 -- # IFS=': '
00:03:13.286 19:10:00 -- setup/common.sh@31 -- # read -r var val _
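[annotation] The @23/@24 and @29 lines above show how the same scan is pointed at one NUMA node: if /sys/devices/system/node/nodeN/meminfo exists it replaces /proc/meminfo, and the "Node <n> " prefix those files carry is stripped with an extglob pattern. A simplified sketch of just that selection step (illustration only):

    # Per-node source selection; extglob is needed for the +([0-9])
    # pattern used at setup/common.sh@29.
    shopt -s extglob
    node=1
    mem_f=/proc/meminfo
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node <n> "; strip it so the
    # same key scan works against either source.
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]:0:3}"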
[setup/common.sh@31-32 xtrace, 00:03:13.286-00:03:13.287: each node1 meminfo key from MemTotal through HugePages_Free is compared against HugePages_Surp and skipped with continue]
00:03:13.287 19:10:00 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.287 19:10:00 -- setup/common.sh@33 -- # echo 0
00:03:13.287 19:10:00 -- setup/common.sh@33 -- # return 0
00:03:13.287 19:10:00 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:13.287 19:10:00 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:13.287 19:10:00 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:13.287 19:10:00 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:13.287 19:10:00 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:13.287 node0=512 expecting 512
00:03:13.287 19:10:00 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:13.287 19:10:00 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:13.287 19:10:00 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:13.287 19:10:00 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:13.287 node1=512 expecting 512
00:03:13.287 19:10:00 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:13.287 
00:03:13.287 real	0m4.784s
00:03:13.287 user	0m1.406s
00:03:13.287 sys	0m3.222s
00:03:13.287 19:10:00 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:13.287 19:10:00 -- common/autotest_common.sh@10 -- # set +x
00:03:13.287 ************************************
00:03:13.287 END TEST per_node_1G_alloc
00:03:13.287 ************************************
00:03:13.287 19:10:00 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:13.287 19:10:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
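[annotation] The arithmetic behind the PASS above, restated as a sketch using only values this run reported (1024 total pages, surplus 0, reserved 0, 512 requested per node; variable names mirror the trace but the snippet is illustrative, not the SPDK script):

    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv )) || echo "total mismatch"
    nodes_test=(512 512)
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv + surp ))   # both 0 in this run
        echo "node$node=${nodes_test[node]} expecting 512"
    done

Run as-is this prints the same "node0=512 expecting 512" / "node1=512 expecting 512" lines seen in the log.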
00:03:13.287 19:10:00 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:13.287 19:10:00 -- common/autotest_common.sh@10 -- # set +x
00:03:13.546 ************************************
00:03:13.546 START TEST even_2G_alloc
00:03:13.546 ************************************
00:03:13.546 19:10:00 -- common/autotest_common.sh@1111 -- # even_2G_alloc
00:03:13.546 19:10:00 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:13.546 19:10:00 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:13.546 19:10:00 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:13.546 19:10:00 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:13.546 19:10:00 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:13.546 19:10:00 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:13.546 19:10:00 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:13.546 19:10:00 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:13.546 19:10:00 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:13.546 19:10:00 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:13.546 19:10:00 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:13.546 19:10:00 -- setup/hugepages.sh@83 -- # : 512
00:03:13.546 19:10:00 -- setup/hugepages.sh@84 -- # : 1
00:03:13.546 19:10:00 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:13.546 19:10:00 -- setup/hugepages.sh@83 -- # : 0
00:03:13.546 19:10:00 -- setup/hugepages.sh@84 -- # : 0
00:03:13.546 19:10:00 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.546 19:10:00 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:13.546 19:10:00 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:13.546 19:10:00 -- setup/hugepages.sh@153 -- # setup output
00:03:13.546 19:10:00 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:13.546 19:10:00 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:16.834 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:16.834 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.834 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
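[annotation] How even_2G_alloc arrives at 512 pages per node, sketched with the values this log confirms (2097152 kB requested, 2048 kB hugepages, two NUMA nodes); the trace then runs scripts/setup.sh with NRHUGE=1024 and HUGE_EVEN_ALLOC=yes set. Illustrative snippet, not the script itself:

    size=2097152            # requested kB
    default_hugepages=2048  # kB per 2M hugepage
    no_nodes=2
    nr_hugepages=$(( size / default_hugepages ))   # 1024
    declare -a nodes_test
    while (( no_nodes > 0 )); do
        nodes_test[no_nodes - 1]=$(( nr_hugepages / 2 ))   # 512 each, two nodes in this run
        (( no_nodes-- ))
    done
    NRHUGE=$nr_hugepages; HUGE_EVEN_ALLOC=yes
    echo "NRHUGE=$NRHUGE nodes: ${nodes_test[*]}"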
00:03:19.374 19:10:05 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:19.374 19:10:05 -- setup/hugepages.sh@89 -- # local node
00:03:19.374 19:10:05 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:19.374 19:10:05 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:19.374 19:10:05 -- setup/hugepages.sh@92 -- # local surp
00:03:19.374 19:10:05 -- setup/hugepages.sh@93 -- # local resv
00:03:19.374 19:10:05 -- setup/hugepages.sh@94 -- # local anon
00:03:19.374 19:10:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:19.374 19:10:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:19.374 19:10:05 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:19.374 19:10:05 -- setup/common.sh@18 -- # local node=
00:03:19.374 19:10:05 -- setup/common.sh@19 -- # local var val
00:03:19.374 19:10:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.374 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.374 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.374 19:10:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.374 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.374 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.374 19:10:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.374 19:10:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.374 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73971484 kB' 'MemAvailable: 78997780 kB' 'Buffers: 20532 kB' 'Cached: 12922032 kB' 'SwapCached: 0 kB' 'Active: 8756264 kB' 'Inactive: 4745244 kB' 'Active(anon): 8126028 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562316 kB' 'Mapped: 177060 kB' 'Shmem: 7567084 kB' 'KReclaimable: 487000 kB' 'Slab: 893840 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406840 kB' 'KernelStack: 16016 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9511428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209452 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
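[annotation] The hugepages.sh@96 test above checks the transparent hugepage policy string ("always [madvise] never" on this runner, where the bracketed word is the active setting). An equivalent stand-alone check, assuming the usual sysfs path (illustration only):

    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        # THP is not fully disabled, so AnonHugePages in /proc/meminfo may be
        # non-zero; that is why the test reads it before checking page counts.
        echo "THP policy: $thp"
    fi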
[setup/common.sh@31-32 xtrace, 00:03:19.374-00:03:19.375: each /proc/meminfo key from MemTotal through HardwareCorrupted is compared against AnonHugePages and skipped with continue]
00:03:19.375 19:10:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:19.375 19:10:05 -- setup/common.sh@33 -- # echo 0
00:03:19.375 19:10:05 -- setup/common.sh@33 -- # return 0
00:03:19.375 19:10:05 -- setup/hugepages.sh@97 -- # anon=0
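[annotation] With anon recorded, the verify pass next gathers the surplus count the same way (and reserved pages appear in the same snapshot). Compact awk stand-ins for those reads, for illustration; the script itself uses the get_meminfo loop traced above:

    anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)   # 0 in this run
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)  # 0 in this run
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    echo "anon=$anon surp=$surp resv=$resv"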
00:03:19.375 19:10:05 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:19.375 19:10:05 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:19.375 19:10:05 -- setup/common.sh@18 -- # local node=
00:03:19.375 19:10:05 -- setup/common.sh@19 -- # local var val
00:03:19.375 19:10:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.375 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.375 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.375 19:10:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.375 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.375 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.375 19:10:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.375 19:10:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.375 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73971732 kB' 'MemAvailable: 78998028 kB' 'Buffers: 20532 kB' 'Cached: 12922032 kB' 'SwapCached: 0 kB' 'Active: 8755532 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125296 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561516 kB' 'Mapped: 177024 kB' 'Shmem: 7567084 kB' 'KReclaimable: 487000 kB' 'Slab: 893848 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406848 kB' 'KernelStack: 16000 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9511440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209436 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[setup/common.sh@31-32 xtrace, 00:03:19.375-00:03:19.376: each /proc/meminfo key from MemTotal through ShmemPmdMapped is compared against HugePages_Surp and skipped with continue]
00:03:19.376 19:10:05 -- setup/common.sh@31
-- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.376 19:10:05 -- setup/common.sh@33 -- # echo 0 00:03:19.376 19:10:05 -- setup/common.sh@33 -- # return 0 00:03:19.376 19:10:05 -- setup/hugepages.sh@99 -- # surp=0 00:03:19.376 19:10:05 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:19.376 19:10:05 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:19.376 19:10:05 -- setup/common.sh@18 -- # local node= 00:03:19.376 19:10:05 -- setup/common.sh@19 -- # local var val 00:03:19.376 19:10:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.376 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.376 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.376 19:10:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.376 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.376 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73972044 kB' 'MemAvailable: 78998340 kB' 'Buffers: 20532 
kB' 'Cached: 12922032 kB' 'SwapCached: 0 kB' 'Active: 8755564 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125328 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561548 kB' 'Mapped: 177024 kB' 'Shmem: 7567084 kB' 'KReclaimable: 487000 kB' 'Slab: 893828 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406828 kB' 'KernelStack: 16016 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9511456 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209436 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.376 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.376 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- 
# continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ 
Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.377 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.377 19:10:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 
19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.378 19:10:05 -- setup/common.sh@33 -- # echo 0 00:03:19.378 19:10:05 -- setup/common.sh@33 -- # return 0 00:03:19.378 19:10:05 -- setup/hugepages.sh@100 -- # resv=0 00:03:19.378 19:10:05 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:19.378 nr_hugepages=1024 00:03:19.378 19:10:05 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:19.378 resv_hugepages=0 00:03:19.378 19:10:05 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:19.378 surplus_hugepages=0 00:03:19.378 19:10:05 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:19.378 anon_hugepages=0 00:03:19.378 19:10:05 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:19.378 19:10:05 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:19.378 19:10:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:19.378 19:10:05 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:19.378 19:10:05 -- setup/common.sh@18 -- # local node= 00:03:19.378 19:10:05 -- setup/common.sh@19 -- # local var val 00:03:19.378 19:10:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.378 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.378 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.378 19:10:05 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.378 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.378 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.378 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73971424 kB' 'MemAvailable: 78997720 kB' 'Buffers: 20532 kB' 'Cached: 12922060 kB' 'SwapCached: 0 kB' 'Active: 8755648 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125412 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561516 kB' 'Mapped: 177024 kB' 'Shmem: 7567112 kB' 'KReclaimable: 487000 kB' 'Slab: 893828 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406828 kB' 'KernelStack: 16000 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 
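For readers following the trace: the get_meminfo calls above (and the per-node ones below) all run the same scan loop from setup/common.sh, whose every iteration appears in the xtrace. A minimal standalone sketch of that loop, with variable names taken from the trace and everything else (exact branching, error handling) assumed:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern seen at common.sh@29
    # Echo the value of one field from /proc/meminfo, or from the per-node
    # meminfo file under sysfs when a node number is given.
    get_meminfo() {
        local get=$1 node=${2:-} var val mem_f mem line
        mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix on sysfs lines
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # this is the long "continue" scan in the log
            echo "$val"
            return 0
        done
        return 1
    }
    get_meminfo HugePages_Rsvd   # prints 0 here, as echoed at common.sh@33 above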
00:03:19.378 19:10:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:19.378 19:10:05 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:19.378 19:10:05 -- setup/common.sh@18 -- # local node=
00:03:19.378 19:10:05 -- setup/common.sh@19 -- # local var val
00:03:19.378 19:10:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.378 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.378 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.378 19:10:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.378 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.378 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.378 19:10:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.378 19:10:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.378 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73971424 kB' 'MemAvailable: 78997720 kB' 'Buffers: 20532 kB' 'Cached: 12922060 kB' 'SwapCached: 0 kB' 'Active: 8755648 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125412 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561516 kB' 'Mapped: 177024 kB' 'Shmem: 7567112 kB' 'KReclaimable: 487000 kB' 'Slab: 893828 kB' 'SReclaimable: 487000 kB' 'SUnreclaim: 406828 kB' 'KernelStack: 16000 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9511468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209436 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:19.379 [repetitive xtrace elided: per-field scan for HugePages_Total, MemTotal through Unaccepted, all "continue"]
00:03:19.379 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:19.379 19:10:05 -- setup/common.sh@33 -- # echo 1024
00:03:19.379 19:10:05 -- setup/common.sh@33 -- # return 0
00:03:19.379 19:10:05 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:19.379 19:10:05 -- setup/hugepages.sh@112 -- # get_nodes
00:03:19.379 19:10:05 -- setup/hugepages.sh@27 -- # local node
00:03:19.379 19:10:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:19.379 19:10:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:19.379 19:10:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:19.379 19:10:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:19.379 19:10:05 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:19.379 19:10:05 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:19.379 19:10:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:19.379 19:10:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:19.379 19:10:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:19.379 19:10:05 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:19.379 19:10:05 -- setup/common.sh@18 -- # local node=0
00:03:19.379 19:10:05 -- setup/common.sh@19 -- # local var val
00:03:19.379 19:10:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.379 19:10:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.379 19:10:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:19.379 19:10:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:19.379 19:10:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.379 19:10:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.379 19:10:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.380 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40800596 kB' 'MemUsed: 7316368 kB' 'SwapCached: 0 kB' 'Active: 3639632 kB' 'Inactive: 607904 kB' 'Active(anon): 3259996 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842732 kB' 'Mapped: 115976 kB' 'AnonPages: 407916 kB' 'Shmem: 2855192 kB' 'KernelStack: 9960 kB' 'PageTables: 5444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329924 kB' 'Slab: 568948 kB' 'SReclaimable: 329924 kB' 'SUnreclaim: 239024 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:19.380 19:10:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.380 [repetitive xtrace elided: per-field scan for HugePages_Surp over the node0 meminfo, all "continue"]
00:03:19.380 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:19.380 19:10:05 -- setup/common.sh@33 -- # echo 0
00:03:19.380 19:10:05 -- setup/common.sh@33 -- # return 0
00:03:19.380 19:10:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
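Node 0 came back with zero surplus pages, leaving its count at the expected 512; the trace now repeats the same query for node 1. In outline, the per-node accounting being walked through looks like this (a sketch using the get_meminfo sketch above; the initial 512/512 split in nodes_test is assumed to have been seeded earlier in the run, as the nodes_sys assignments suggest):

    # Per-node hugepage accounting as traced at setup/hugepages.sh@115-117:
    # each node's test count absorbs reserved pages plus its own surplus.
    declare -a nodes_test=(512 512)   # assumed seed: half of the 1024 pages per node
    resv=0                            # HugePages_Rsvd, from the earlier get_meminfo
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))
        (( nodes_test[node] += $(get_meminfo HugePages_Surp "$node") ))
        echo "node$node=${nodes_test[node]} expecting 512"
    done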
setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176564 kB' 'MemFree: 33170576 kB' 'MemUsed: 11005988 kB' 'SwapCached: 0 kB' 'Active: 5115692 kB' 'Inactive: 4137340 kB' 'Active(anon): 4865092 kB' 'Inactive(anon): 0 kB' 'Active(file): 250600 kB' 'Inactive(file): 4137340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9099896 kB' 'Mapped: 61048 kB' 'AnonPages: 153220 kB' 'Shmem: 4711956 kB' 'KernelStack: 6024 kB' 'PageTables: 2916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157076 kB' 'Slab: 324880 kB' 'SReclaimable: 157076 kB' 'SUnreclaim: 167804 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.381 19:10:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.381 
00:03:19.381 19:10:05 -- setup/common.sh@32 -- # continue
[xtrace scan elided: the remaining /proc/meminfo keys before HugePages_Surp each fail the match and log "continue"]
00:03:19.381 19:10:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:19.381 19:10:05 -- setup/common.sh@33 -- # echo 0
00:03:19.381 19:10:05 -- setup/common.sh@33 -- # return 0
00:03:19.381 19:10:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:19.381 19:10:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:19.381 19:10:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:19.382 19:10:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:19.382 19:10:05 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:19.382 node0=512 expecting 512
00:03:19.382 19:10:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:19.382 19:10:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:19.382 19:10:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:19.382 19:10:05 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:19.382 node1=512 expecting 512
00:03:19.382 19:10:05 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:19.382
00:03:19.382 real 0m5.649s
00:03:19.382 user 0m2.036s
00:03:19.382 sys 0m3.596s
00:03:19.382 19:10:05 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:19.382 19:10:05 -- common/autotest_common.sh@10 -- # set +x
00:03:19.382 ************************************
00:03:19.382 END TEST even_2G_alloc
00:03:19.382 ************************************
00:03:19.382 19:10:06 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:19.382 19:10:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:19.382 19:10:06 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:19.382 19:10:06 -- common/autotest_common.sh@10 -- # set +x
00:03:19.382 ************************************
00:03:19.382 START TEST odd_alloc
00:03:19.382 ************************************
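The odd_alloc trace below requests 2098176 kB of 2048 kB hugepages, which rounds up to nr_hugepages=1025 (1025 x 2048 kB = 2099200 kB, matching the Hugetlb value in the meminfo snapshots further down), then splits that odd count across the box's two NUMA nodes. A minimal sketch of the split, reconstructed so as to be consistent with the hugepages.sh@81-@84 trace lines (the actual SPDK script may differ in detail):

    # Distribute _nr_hugepages across _no_nodes, highest-numbered node first.
    # Matches the trace: 1025 pages over 2 nodes -> node1=512, node0=513.
    _nr_hugepages=1025 _no_nodes=2
    declare -a nodes_test=()
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))  # this node's integer share
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))         # pages left, traced as ": 513" then ": 0"
        : $(( --_no_nodes ))                                        # nodes left, traced as ": 1" then ": 0"
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"            # node0=513 node1=512

Integer division hands node1 the floor share (512) first, so the leftover page lands on node0 (513); the "expecting" echoes later in the log check exactly that split.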
00:03:19.382 19:10:06 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:19.382 19:10:06 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:19.382 19:10:06 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:19.382 19:10:06 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:19.382 19:10:06 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:19.382 19:10:06 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:19.382 19:10:06 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:19.382 19:10:06 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:19.382 19:10:06 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:19.382 19:10:06 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:19.382 19:10:06 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:19.382 19:10:06 -- setup/hugepages.sh@83 -- # : 513
00:03:19.382 19:10:06 -- setup/hugepages.sh@84 -- # : 1
00:03:19.382 19:10:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:19.382 19:10:06 -- setup/hugepages.sh@83 -- # : 0
00:03:19.382 19:10:06 -- setup/hugepages.sh@84 -- # : 0
00:03:19.382 19:10:06 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:19.382 19:10:06 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:19.382 19:10:06 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:19.382 19:10:06 -- setup/hugepages.sh@160 -- # setup output
00:03:19.382 19:10:06 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:19.382 19:10:06 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:22.669 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:22.669 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:22.669 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:24.581 19:10:11 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:24.581 19:10:11 -- setup/hugepages.sh@89 -- # local node
00:03:24.581 19:10:11 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:24.581 19:10:11 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:24.581 19:10:11 -- setup/hugepages.sh@92 -- # local surp
00:03:24.581 19:10:11 -- setup/hugepages.sh@93 -- # local resv
00:03:24.581 19:10:11 -- setup/hugepages.sh@94 -- # local anon
00:03:24.581 19:10:11 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:24.581 19:10:11 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:24.581 19:10:11 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:24.581 19:10:11 -- setup/common.sh@18 -- # local node=
00:03:24.581 19:10:11 -- setup/common.sh@19 -- # local var val
00:03:24.581 19:10:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.581 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.581 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.581 19:10:11 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.581 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.581 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.581 19:10:11 -- setup/common.sh@31 -- # IFS=': '
00:03:24.581 19:10:11 -- setup/common.sh@31 -- # read -r var val _
00:03:24.581 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73972604 kB' 'MemAvailable: 78998868 kB' 'Buffers: 20532 kB' 'Cached: 12922184 kB' 'SwapCached: 0 kB' 'Active: 8756536 kB' 'Inactive: 4745244 kB' 'Active(anon): 8126300 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 562008 kB' 'Mapped: 177232 kB' 'Shmem: 7567236 kB' 'KReclaimable: 486968 kB' 'Slab: 894160 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 407192 kB' 'KernelStack: 16096 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 9512224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[xtrace scan elided: every /proc/meminfo key before AnonHugePages fails the match and logs "continue"]
00:03:24.582 19:10:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:24.582 19:10:11 -- setup/common.sh@33 -- # echo 0
00:03:24.582 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.582 19:10:11 -- setup/hugepages.sh@97 -- # anon=0
00:03:24.582 19:10:11 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:24.582 19:10:11 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.582 19:10:11 -- setup/common.sh@18 -- # local node=
00:03:24.582 19:10:11 -- setup/common.sh@19 -- # local var val
00:03:24.582 19:10:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.582 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.582 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.582 19:10:11 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.582 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.582 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.582 19:10:11 -- setup/common.sh@31 -- # IFS=': '
00:03:24.582 19:10:11 -- setup/common.sh@31 -- # read -r var val _
00:03:24.582 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73973152 kB' 'MemAvailable: 78999416 kB' 'Buffers: 20532 kB' 'Cached: 12922188 kB' 'SwapCached: 0 kB' 'Active: 8755804 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125568 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561776 kB' 'Mapped: 177112 kB' 'Shmem: 7567240 kB' 'KReclaimable: 486968 kB' 'Slab: 894152 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 407184 kB' 'KernelStack: 16112 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 9512236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209452 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
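Each get_meminfo call in this trace snapshots the meminfo file with mapfile and then walks it key by key; the long runs of "continue" are that walk. A re-creation of the parser, reconstructed from the common.sh@16-@33 trace lines above (names follow the trace; treat this as a sketch, not the actual SPDK source):

    #!/usr/bin/env bash
    shopt -s extglob                      # needed for the +([0-9]) pattern below

    get_meminfo() {                       # usage: get_meminfo <key> [numa_node]
        local get=$1 node=${2:-}
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # With a node argument, read that node's own meminfo instead
        # (assumption: this is what the common.sh@23 existence check selects).
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")  # drop the "Node <n> " prefix of per-node files
        while IFS=': ' read -r var val _; do
            # First field is the key, second the number; a trailing "kB" falls into $_.
            [[ $var == "$get" ]] && echo "$val" && return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    get_meminfo AnonHugePages       # on the box traced above: prints 0
    get_meminfo HugePages_Total 0   # hypothetical per-node call, e.g. prints 513

Because the caller captures stdout with command substitution, only the xtrace of "echo 0" appears in the log, never the value itself.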
[xtrace scan elided: every /proc/meminfo key before HugePages_Surp fails the match and logs "continue"]
00:03:24.584 19:10:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.584 19:10:11 -- setup/common.sh@33 -- # echo 0
00:03:24.584 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.584 19:10:11 -- setup/hugepages.sh@99 -- # surp=0
00:03:24.584 19:10:11 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:24.584 19:10:11 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:24.584 19:10:11 -- setup/common.sh@18 -- # local node=
00:03:24.584 19:10:11 -- setup/common.sh@19 -- # local var val
00:03:24.584 19:10:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.584 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.584 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.584 19:10:11 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.584 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.584 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.584 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73973592 kB' 'MemAvailable: 78999856 kB' 'Buffers: 20532 kB' 'Cached: 12922200 kB' 'SwapCached: 0 kB' 'Active: 8755776 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125540 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561744 kB' 'Mapped: 177112 kB' 'Shmem: 7567252 kB' 'KReclaimable: 486968 kB' 'Slab: 894152 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 407184 kB' 'KernelStack: 16096 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 9512252 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209452 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
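The mem=("${mem[@]#Node +([0-9]) }") step traced at common.sh@29 only matters for the per-node variant, where every line of /sys/devices/system/node/nodeN/meminfo carries a "Node <n> " prefix. A small demonstration of that strip (the sample line is illustrative; extglob is required for the +([0-9]) pattern):

    shopt -s extglob
    line='Node 0 HugePages_Total:     513'
    echo "${line#Node +([0-9]) }"    # -> 'HugePages_Total:     513'

After the strip, the same IFS=': ' key/value parse works for both the system-wide and the per-node files.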
[xtrace scan elided: every /proc/meminfo key before HugePages_Rsvd fails the match and logs "continue"]
00:03:24.586 19:10:11 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:24.586 19:10:11 -- setup/common.sh@33 -- # echo 0
00:03:24.586 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.586 19:10:11 -- setup/hugepages.sh@100 -- # resv=0
00:03:24.586 19:10:11 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:24.586 nr_hugepages=1025
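With anon, surp and resv all read back as 0, the checks traced at hugepages.sh@107-@109 reduce to plain arithmetic on this run's numbers:

    # The consistency check from the trace, with the values echoed in this log.
    nr_hugepages=1025 surp=0 resv=0
    (( 1025 == nr_hugepages + surp + resv )) && echo 'hugepage accounting consistent'   # 1025 == 1025+0+0
    (( 1025 == nr_hugepages )) && echo 'requested count fully allocated'

Both conditions hold, so the test proceeds to read HugePages_Total back from /proc/meminfo.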
resv_hugepages=0 00:03:24.586 19:10:11 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:24.586 surplus_hugepages=0 00:03:24.586 19:10:11 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:24.586 anon_hugepages=0 00:03:24.586 19:10:11 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:24.586 19:10:11 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:24.586 19:10:11 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:24.586 19:10:11 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:24.586 19:10:11 -- setup/common.sh@18 -- # local node= 00:03:24.586 19:10:11 -- setup/common.sh@19 -- # local var val 00:03:24.586 19:10:11 -- setup/common.sh@20 -- # local mem_f mem 00:03:24.586 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:24.586 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:24.586 19:10:11 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:24.586 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem 00:03:24.586 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:24.586 19:10:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.586 19:10:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.587 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73973592 kB' 'MemAvailable: 78999856 kB' 'Buffers: 20532 kB' 'Cached: 12922212 kB' 'SwapCached: 0 kB' 'Active: 8755540 kB' 'Inactive: 4745244 kB' 'Active(anon): 8125304 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 561448 kB' 'Mapped: 177112 kB' 'Shmem: 7567264 kB' 'KReclaimable: 486968 kB' 'Slab: 894152 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 407184 kB' 'KernelStack: 16080 kB' 'PageTables: 8556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485768 kB' 'Committed_AS: 9512264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209468 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # continue 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # continue 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # continue 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.587 19:10:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.587 19:10:11 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:24.587 19:10:11 -- 
[the scan walks the snapshot above field by field, MemTotal through Unaccepted, one '[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' / 'continue' pair per key, until the target matches]
00:03:24.589 19:10:11 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:24.589 19:10:11 -- setup/common.sh@33 -- # echo 1025
00:03:24.589 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.589 19:10:11 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:24.589 19:10:11 -- setup/hugepages.sh@112 -- # get_nodes
00:03:24.589 19:10:11 -- setup/hugepages.sh@27 -- # local node
00:03:24.589 19:10:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.589 19:10:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:24.589 19:10:11 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.589 19:10:11 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:24.589 19:10:11 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:24.589 19:10:11 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:24.589 19:10:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:24.589 19:10:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:24.589 19:10:11 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
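get_meminfo HugePages_Surp 0 is the same lookup pointed at node 0's private meminfo copy: each NUMA node exports one under sysfs, with every line prefixed by the node number, which is why the trace strips 'Node +([0-9]) ' before scanning. A small sketch of reading one per-node counter directly; node_meminfo is a hypothetical name and the logic is reconstructed from the trace.

#!/usr/bin/env bash
# Print one meminfo field for every NUMA node, e.g. HugePages_Surp.
shopt -s extglob
node_meminfo() {
    local get=$1 node n var val _
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}    # ".../node1" -> "1"
        # Strip the "Node <N> " prefix, then scan Key: value pairs.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && echo "node$n $get: $val"
        done < <(sed -E 's/^Node [0-9]+ //' "$node/meminfo")
    done
}
node_meminfo HugePages_Surp   # -> node0 HugePages_Surp: 0, node1 ...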
00:03:24.589 19:10:11 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.589 19:10:11 -- setup/common.sh@18 -- # local node=0
00:03:24.589 19:10:11 -- setup/common.sh@19 -- # local var val
00:03:24.589 19:10:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.589 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.589 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:24.589 19:10:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:24.589 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.589 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.589 19:10:11 -- setup/common.sh@31 -- # IFS=': '
00:03:24.589 19:10:11 -- setup/common.sh@31 -- # read -r var val _
00:03:24.589 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40802388 kB' 'MemUsed: 7314576 kB' 'SwapCached: 0 kB' 'Active: 3639752 kB' 'Inactive: 607904 kB' 'Active(anon): 3260116 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842772 kB' 'Mapped: 116064 kB' 'AnonPages: 408024 kB' 'Shmem: 2855232 kB' 'KernelStack: 10056 kB' 'PageTables: 5648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329924 kB' 'Slab: 569124 kB' 'SReclaimable: 329924 kB' 'SUnreclaim: 239200 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[node0 snapshot scanned field by field, MemTotal through HugePages_Free, one '[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' / 'continue' pair per key]
00:03:24.590 19:10:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.590 19:10:11 -- setup/common.sh@33 -- # echo 0
00:03:24.590 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.590 19:10:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:24.590 19:10:11 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:24.590 19:10:11 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:24.590 19:10:11 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:24.590 19:10:11 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.590 19:10:11 -- setup/common.sh@18 -- # local node=1
00:03:24.590 19:10:11 -- setup/common.sh@19 -- # local var val
00:03:24.590 19:10:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.590 19:10:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.590 19:10:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:24.590 19:10:11 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:24.590 19:10:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.590 19:10:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.590 19:10:11 -- setup/common.sh@31 -- # IFS=': '
00:03:24.590 19:10:11 -- setup/common.sh@31 -- # read -r var val _
00:03:24.591 19:10:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176564 kB' 'MemFree: 33171752 kB' 'MemUsed: 11004812 kB' 'SwapCached: 0 kB' 'Active: 5116096 kB' 'Inactive: 4137340 kB' 'Active(anon): 4865496 kB' 'Inactive(anon): 0 kB' 'Active(file): 250600 kB' 'Inactive(file): 4137340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9100000 kB' 'Mapped: 61048 kB' 'AnonPages: 153708 kB' 'Shmem: 4712060 kB' 'KernelStack: 6040 kB' 'PageTables: 2952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157044 kB' 'Slab: 325028 kB' 'SReclaimable: 157044 kB' 'SUnreclaim: 167984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[node1 snapshot scanned field by field, MemTotal through HugePages_Free, one '[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' / 'continue' pair per key]
00:03:24.592 19:10:11 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.592 19:10:11 -- setup/common.sh@33 -- # echo 0
00:03:24.592 19:10:11 -- setup/common.sh@33 -- # return 0
00:03:24.592 19:10:11 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:24.592 19:10:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:24.592 19:10:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:24.592 19:10:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:24.592 19:10:11 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:24.592 node0=512 expecting 513
00:03:24.592 19:10:11 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:24.592 19:10:11 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:24.592 19:10:11 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:24.592 19:10:11 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:24.592 node1=513 expecting 512
00:03:24.592 19:10:11 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
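The check above is order-insensitive on purpose: 512 pages landed on node 0 where 513 were predicted (and vice versa), yet the test passes because hugepages.sh folds both sides into associative arrays used as sets and compares the key lists, as the '[[ 512 513 == \5\1\2\ \5\1\3 ]]' trace shows. A sketch of that comparison under the values from this run; the logic is reconstructed, and the ordering claim in the comment is how it behaves in practice rather than a guarantee.

#!/usr/bin/env bash
# Order-insensitive comparison of per-node hugepage counts.
declare -A sorted_t=() sorted_s=()
nodes_test=(512 513)   # counts the test expects per node
nodes_sys=(513 512)    # counts sysfs actually reports (nodes swapped)
for node in "${!nodes_test[@]}"; do
    sorted_t[${nodes_test[node]}]=1   # assoc-array keys act as a set
    sorted_s[${nodes_sys[node]}]=1
done
# Identical key sets expand in the same order in practice, so comparing
# the expanded key lists checks that the two distributions match.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "distribution matches"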
00:03:24.592
00:03:24.592 real	0m5.326s
00:03:24.592 user	0m2.035s
00:03:24.592 sys	0m3.323s
00:03:24.592 19:10:11 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:24.592 19:10:11 -- common/autotest_common.sh@10 -- # set +x
00:03:24.592 ************************************
00:03:24.592 END TEST odd_alloc
00:03:24.592 ************************************
00:03:24.592 19:10:11 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:24.592 19:10:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:24.592 19:10:11 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:24.592 19:10:11 -- common/autotest_common.sh@10 -- # set +x
00:03:24.852 ************************************
00:03:24.852 START TEST custom_alloc
00:03:24.852 ************************************
00:03:24.852 19:10:11 -- common/autotest_common.sh@1111 -- # custom_alloc
00:03:24.852 19:10:11 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:24.852 19:10:11 -- setup/hugepages.sh@169 -- # local node
00:03:24.852 19:10:11 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:24.852 19:10:11 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:24.852 19:10:11 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:24.852 19:10:11 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:24.852 19:10:11 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:24.852 19:10:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:24.852 19:10:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:24.852 19:10:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:24.852 19:10:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:24.852 19:10:11 -- setup/hugepages.sh@83 -- # : 256
00:03:24.852 19:10:11 -- setup/hugepages.sh@84 -- # : 1
00:03:24.852 19:10:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:24.852 19:10:11 -- setup/hugepages.sh@83 -- # : 0
00:03:24.852 19:10:11 -- setup/hugepages.sh@84 -- # : 0
00:03:24.852 19:10:11 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:24.852 19:10:11 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:24.852 19:10:11 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:24.852 19:10:11 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:24.852 19:10:11 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:24.852 19:10:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:24.852 19:10:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:24.852 19:10:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:24.852 19:10:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:24.852 19:10:11 -- setup/hugepages.sh@78 -- # return 0
00:03:24.852 19:10:11 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:24.852 19:10:11 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:24.852 19:10:11 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:24.852 19:10:11 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:24.852 19:10:11 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:24.852 19:10:11 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:24.852 19:10:11 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:24.852 19:10:11 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:24.852 19:10:11 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:24.852 19:10:11 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:24.852 19:10:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:24.852 19:10:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:24.852 19:10:11 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:24.852 19:10:11 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:24.852 19:10:11 -- setup/hugepages.sh@78 -- # return 0
00:03:24.852 19:10:11 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:24.852 19:10:11 -- setup/hugepages.sh@187 -- # setup output
00:03:24.852 19:10:11 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:24.852 19:10:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
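custom_alloc drives scripts/setup.sh through the HUGENODE variable assembled just above: with IFS set to ',', the per-node requests nodes_hp[0]=512 and nodes_hp[1]=1024 join into the single string handed to the allocator. A sketch of that assembly, with values taken from this run and a hypothetical wrapper function name:

#!/usr/bin/env bash
# How the HUGENODE string above is put together: 512 x 2 MiB pages on
# node 0 plus 1024 x 2 MiB on node 1, comma-joined via IFS.
build_hugenode() {
    local IFS=,                   # "${arr[*]}" joins on the first IFS char
    local -a HUGENODE=()
    local -a nodes_hp=([0]=512 [1]=1024)
    local node
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    echo "HUGENODE=${HUGENODE[*]}"
}
build_hugenode   # -> HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024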
(8086 2021): Already using the vfio-pci driver 00:03:29.042 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:29.042 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:29.042 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:29.042 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:30.424 19:10:17 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:30.424 19:10:17 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:30.424 19:10:17 -- setup/hugepages.sh@89 -- # local node 00:03:30.424 19:10:17 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:30.424 19:10:17 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:30.424 19:10:17 -- setup/hugepages.sh@92 -- # local surp 00:03:30.424 19:10:17 -- setup/hugepages.sh@93 -- # local resv 00:03:30.424 19:10:17 -- setup/hugepages.sh@94 -- # local anon 00:03:30.424 19:10:17 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:30.424 19:10:17 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:30.424 19:10:17 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:30.424 19:10:17 -- setup/common.sh@18 -- # local node= 00:03:30.424 19:10:17 -- setup/common.sh@19 -- # local var val 00:03:30.424 19:10:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.424 19:10:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.424 19:10:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.424 19:10:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.424 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.424 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 72915852 kB' 'MemAvailable: 77942116 kB' 'Buffers: 20532 kB' 'Cached: 12922344 kB' 'SwapCached: 0 kB' 'Active: 8757428 kB' 'Inactive: 4745244 kB' 'Active(anon): 8127192 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563024 kB' 'Mapped: 177260 kB' 'Shmem: 7567396 kB' 'KReclaimable: 486968 kB' 'Slab: 893940 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 406972 kB' 'KernelStack: 16160 kB' 'PageTables: 8896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 9513732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209612 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB' 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- 
# continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # [[ SwapTotal == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.424 19:10:17 -- setup/common.sh@32 -- # continue 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.424 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.425 [... repeated set -x match/continue entries elided: every remaining /proc/meminfo key from SwapFree through HardwareCorrupted fails the \A\n\o\n\H\u\g\e\P\a\g\e\s pattern at setup/common.sh@32 and the read loop continues ...]
00:03:30.425 19:10:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:30.425 19:10:17 -- setup/common.sh@33 -- # echo 0 00:03:30.425 19:10:17 -- setup/common.sh@33 -- # return 0 00:03:30.425 19:10:17 -- setup/hugepages.sh@97 -- # anon=0
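The trace above is setup/common.sh's get_meminfo helper, run under set -x, resolving AnonHugePages to 0 for setup/hugepages.sh. A minimal sketch of the technique it walks through — slurp the target meminfo file, strip any "Node <N> " prefix, then split each line on ':' and space until the requested key matches — follows. This is a hedged reconstruction from the trace, not the SPDK script verbatim; the helper name, argument order, the mapfile read, and the extglob strip are taken from the traced commands, while the control flow around them is inferred.

    #!/usr/bin/env bash
    shopt -s extglob   # required for the +([0-9]) pattern seen in the trace

    # get_meminfo <Key> [<node>] -> prints the value column for <Key>.
    get_meminfo() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local mem line var val _
        # Per-node stats live under sysfs and prefix each line with "Node <N> ".
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix, if any
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"   # e.g. 0 for AnonHugePages in this run
                return 0
            fi
        done
        return 1
    }

Called as in the trace — get_meminfo HugePages_Surp for the global file, or get_meminfo HugePages_Surp 0 for node 0 — the long match/continue churn is just this loop discarding non-matching keys one at a time.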
00:03:30.425 19:10:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:30.425 19:10:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:30.425 19:10:17 -- setup/common.sh@18 -- # local node= 00:03:30.425 19:10:17 -- setup/common.sh@19 -- # local var val 00:03:30.425 19:10:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.425 19:10:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.425 19:10:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.425 19:10:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.425 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.425 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.425 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.425 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.425 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 72915984 kB' 'MemAvailable: 77942248 kB' 'Buffers: 20532 kB' 'Cached: 12922348 kB' 'SwapCached: 0 kB' 'Active: 8757564 kB' 'Inactive: 4745244 kB' 'Active(anon): 8127328 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563148 kB' 'Mapped: 177192 kB' 'Shmem: 7567400 kB' 'KReclaimable: 486968 kB' 'Slab: 893940 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 406972 kB' 'KernelStack: 16352 kB' 'PageTables: 9304 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 9515112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209628 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:30.425 [... per-key checks elided: MemTotal through HugePages_Rsvd all fail the \H\u\g\e\P\a\g\e\s\_\S\u\r\p pattern at setup/common.sh@32 and continue ...]
00:03:30.427 19:10:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:30.427 19:10:17 -- setup/common.sh@33 -- # echo 0 00:03:30.427 19:10:17 -- setup/common.sh@33 -- # return 0 00:03:30.427 19:10:17 -- setup/hugepages.sh@99 -- # surp=0
00:03:30.427 19:10:17 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:30.427 19:10:17 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:30.427 19:10:17 -- setup/common.sh@18 -- # local node= 00:03:30.427 19:10:17 -- setup/common.sh@19 -- # local var val 00:03:30.427 19:10:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.427 19:10:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.427 19:10:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.427 19:10:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.427 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.427 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.427 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.427 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.427 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 72917000 kB' 'MemAvailable: 77943264 kB' 'Buffers: 20532 kB' 'Cached: 12922348 kB' 'SwapCached: 0 kB' 'Active: 8757832 kB' 'Inactive: 4745244 kB' 'Active(anon): 8127596 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563412 kB' 'Mapped: 177192 kB' 'Shmem: 7567400 kB' 'KReclaimable: 486968 kB' 'Slab: 893908 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 406940 kB' 'KernelStack: 16496 kB' 'PageTables: 9468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 9514948 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209660 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:30.427 [... per-key checks elided: MemTotal through HugePages_Free all fail the \H\u\g\e\P\a\g\e\s\_\R\s\v\d pattern at setup/common.sh@32 and continue ...]
00:03:30.427 19:10:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:30.428 19:10:17 -- setup/common.sh@33 -- # echo 0 00:03:30.428 19:10:17 -- setup/common.sh@33 -- # return 0 00:03:30.428 19:10:17 -- setup/hugepages.sh@100 -- # resv=0
00:03:30.428 19:10:17 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:30.428 nr_hugepages=1536 00:03:30.428 19:10:17 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:30.428 resv_hugepages=0 00:03:30.428 19:10:17 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:30.428 surplus_hugepages=0 00:03:30.428 19:10:17 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:30.428 anon_hugepages=0 00:03:30.428 19:10:17 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:30.428 19:10:17 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
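At this point hugepages.sh has resolved anon=0, surp=0, resv=0 and echoes nr_hugepages=1536; the two arithmetic guards at @107/@109 assert that the configured hugepage count is fully accounted for before the per-node checks start. A hedged restatement with this run's values (variable roles inferred from the trace, not copied from the script):

    # Values echoed by the trace above.
    nr_hugepages=1536 surp=0 resv=0 anon=0

    # hugepages.sh@107: the reported total must equal requested + surplus + reserved.
    (( 1536 == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch"
    # hugepages.sh@109: with surp=resv=0, the total must be exactly nr_hugepages.
    (( 1536 == nr_hugepages )) || echo "unexpected surplus/reserved pages"

The 1536 literal is what set -x prints after expansion; the next call, get_meminfo HugePages_Total, reads the same figure back from /proc/meminfo.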
00:03:30.428 19:10:17 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:30.428 19:10:17 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:30.428 19:10:17 -- setup/common.sh@18 -- # local node= 00:03:30.428 19:10:17 -- setup/common.sh@19 -- # local var val 00:03:30.428 19:10:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.428 19:10:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.428 19:10:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:30.428 19:10:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:30.428 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.428 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.428 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.428 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.428 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 72915992 kB' 'MemAvailable: 77942256 kB' 'Buffers: 20532 kB' 'Cached: 12922356 kB' 'SwapCached: 0 kB' 'Active: 8759016 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128780 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564616 kB' 'Mapped: 177192 kB' 'Shmem: 7567408 kB' 'KReclaimable: 486968 kB' 'Slab: 893932 kB' 'SReclaimable: 486968 kB' 'SUnreclaim: 406964 kB' 'KernelStack: 16768 kB' 'PageTables: 10584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962504 kB' 'Committed_AS: 9515508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209692 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:30.429 [... per-key checks elided: MemTotal through Unaccepted all fail the \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l pattern at setup/common.sh@32 and continue ...]
00:03:30.430 19:10:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:30.430 19:10:17 -- setup/common.sh@33 -- # echo 1536 00:03:30.430 19:10:17 -- setup/common.sh@33 -- # return 0 00:03:30.430 19:10:17 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:30.430 19:10:17 -- setup/hugepages.sh@112 -- # get_nodes 00:03:30.430 19:10:17 -- setup/hugepages.sh@27 -- # local node 00:03:30.430 19:10:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:30.430 19:10:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:30.430 19:10:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:30.430 19:10:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:30.430 19:10:17 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:30.430 19:10:17 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
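get_nodes then enumerates the NUMA nodes by globbing sysfs: node0 carries 512 hugepages and node1 carries 1024, summing to the 1536 total, so no_nodes=2. A sketch of that walk follows; the trace only shows the already-expanded assignments, so fetching the counts through the get_meminfo sketch above is an assumption about how the values are obtained:

    shopt -s extglob nullglob
    nodes_sys=()
    # hugepages.sh@29-@32: derive each node index from its sysfs path and
    # record that node's hugepage count (512 on node0, 1024 on node1 here).
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
    done
    no_nodes=${#nodes_sys[@]}   # 2 on this machine
    (( no_nodes > 0 )) || echo "no NUMA nodes found"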
00:03:30.430 19:10:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:30.430 19:10:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:30.430 19:10:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:30.430 19:10:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:30.430 19:10:17 -- setup/common.sh@18 -- # local node=0 00:03:30.430 19:10:17 -- setup/common.sh@19 -- # local var val 00:03:30.430 19:10:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:30.430 19:10:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:30.430 19:10:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:30.430 19:10:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:30.430 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:30.430 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:30.430 19:10:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:30.430 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.430 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 40804120 kB' 'MemUsed: 7312844 kB' 'SwapCached: 0 kB' 'Active: 3642040 kB' 'Inactive: 607904 kB' 'Active(anon): 3262404 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842840 kB' 'Mapped: 116144 kB' 'AnonPages: 410192 kB' 'Shmem: 2855300 kB' 'KernelStack: 10488 kB' 'PageTables: 7020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329924 kB' 'Slab: 568864 kB' 'SReclaimable: 329924 kB' 'SUnreclaim: 238940 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:30.430 [... per-key checks elided: MemTotal through HugePages_Free all fail the \H\u\g\e\P\a\g\e\s\_\S\u\r\p pattern at setup/common.sh@32 and continue ...]
00:03:30.431 19:10:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:30.431 19:10:17 -- setup/common.sh@33 -- # echo 0 00:03:30.431 19:10:17 -- setup/common.sh@33 -- # return 0 00:03:30.431 19:10:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:30.431 19:10:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:30.431 19:10:17 -- setup/common.sh@28 -- # mapfile -t mem
00:03:30.431 19:10:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:30.431 19:10:17 -- setup/common.sh@31 -- # IFS=': '
00:03:30.431 19:10:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176564 kB' 'MemFree: 32111908 kB' 'MemUsed: 12064656 kB' 'SwapCached: 0 kB' 'Active: 5116656 kB' 'Inactive: 4137340 kB' 'Active(anon): 4866056 kB' 'Inactive(anon): 0 kB' 'Active(file): 250600 kB' 'Inactive(file): 4137340 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9100088 kB' 'Mapped: 61048 kB' 'AnonPages: 154088 kB' 'Shmem: 4712148 kB' 'KernelStack: 6040 kB' 'PageTables: 2968 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 157044 kB' 'Slab: 324912 kB' 'SReclaimable: 157044 kB' 'SUnreclaim: 167868 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:30.431 19:10:17 -- setup/common.sh@31 -- # read -r var val _
00:03:30.431 19:10:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:30.431 19:10:17 -- setup/common.sh@32 -- # continue
[... xtrace: identical read/compare/continue cycle for every remaining node1 meminfo key (MemFree through HugePages_Free), none of which matches HugePages_Surp ...]
00:03:30.432 19:10:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:30.432 19:10:17 -- setup/common.sh@33 -- # echo 0
00:03:30.432 19:10:17 -- setup/common.sh@33 -- # return 0
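The scan traced above is get_meminfo from setup/common.sh walking a meminfo snapshot key by key; the odd-looking right-hand side \H\u\g\e\P\a\g\e\s\_\S\u\r\p is simply how set -x renders a quoted literal pattern inside [[ ]]. The following is a minimal, hypothetical re-creation of that lookup technique, inferred from the xtrace output rather than taken from the SPDK source:

#!/usr/bin/env bash
# Sketch of the lookup traced above (inferred from xtrace, not SPDK's script).
shopt -s extglob   # needed for the +([0-9]) pattern used below

get_meminfo() {
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo
    # A per-node query reads the node-local meminfo exposed by sysfs instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # sysfs lines carry a "Node <N> " prefix; strip it so both sources parse alike.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo HugePages_Surp 1   # prints 0 for the node1 snapshot traced above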
00:03:30.432 19:10:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:30.432 19:10:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:30.432 19:10:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:30.432 19:10:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:30.432 19:10:17 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:30.432 node0=512 expecting 512
00:03:30.432 19:10:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:30.432 19:10:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:30.432 19:10:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:30.432 19:10:17 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:30.432 node1=1024 expecting 1024
00:03:30.432 19:10:17 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:30.432
00:03:30.432 real 0m5.685s
00:03:30.432 user 0m2.078s
00:03:30.432 sys 0m3.644s
00:03:30.432 19:10:17 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:03:30.432 19:10:17 -- common/autotest_common.sh@10 -- # set +x
00:03:30.432 ************************************
00:03:30.432 END TEST custom_alloc
00:03:30.432 ************************************
00:03:30.692 19:10:17 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:30.692 19:10:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:30.692 19:10:17 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:30.692 19:10:17 -- common/autotest_common.sh@10 -- # set +x
00:03:30.692 ************************************
00:03:30.692 START TEST no_shrink_alloc
00:03:30.692 ************************************
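For readers decoding the PASS condition in the custom_alloc trace above: the measured per-node hugepage counts are joined into one comma-separated string and compared against the expected layout, and xtrace escapes the quoted right-hand side character by character, which is why it appears as \5\1\2\,\1\0\2\4. A small, hypothetical sketch of that comparison (variable names are illustrative, not SPDK's):

#!/usr/bin/env bash
# Hypothetical sketch of the expected-vs-actual check traced above.
declare -A nodes_test=([0]=512 [1]=1024)   # hugepages measured per NUMA node
expected='512,1024'                        # layout the test configured

actual=''
for node in $(printf '%s\n' "${!nodes_test[@]}" | sort -n); do
    echo "node${node}=${nodes_test[$node]} expecting ${nodes_test[$node]}"
    actual+=${actual:+,}${nodes_test[$node]}
done

# Equivalent to the trace's [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
[[ $actual == "$expected" ]] && echo 'custom_alloc: counts match'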
00:03:30.692 19:10:17 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:30.692 19:10:17 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:30.692 19:10:17 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:30.692 19:10:17 -- setup/hugepages.sh@51 -- # shift
00:03:30.692 19:10:17 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:30.692 19:10:17 -- setup/hugepages.sh@52 -- # local node_ids
00:03:30.692 19:10:17 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:30.692 19:10:17 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:30.692 19:10:17 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:30.692 19:10:17 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:30.692 19:10:17 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:30.692 19:10:17 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:30.692 19:10:17 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:30.692 19:10:17 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:30.692 19:10:17 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:30.692 19:10:17 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:30.692 19:10:17 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:30.692 19:10:17 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:30.692 19:10:17 -- setup/hugepages.sh@73 -- # return 0
00:03:30.692 19:10:17 -- setup/hugepages.sh@198 -- # setup output
00:03:30.692 19:10:17 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:30.692 19:10:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:33.976 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:33.976 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:33.976 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
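The get_test_nr_hugepages trace above reduces the requested pool size to a page count before setup.sh reconfigures the machine: 2097152 kB divided by the 2048 kB default hugepage size reported in this run's meminfo gives the nr_hugepages=1024 assigned to node 0. A minimal, hypothetical sketch of that arithmetic (variable names are illustrative, not SPDK's):

#!/usr/bin/env bash
# Hypothetical sketch of the size-to-page-count conversion traced above.
size_kb=2097152                                      # requested pool size in kB
default_hugepages_kb=$(awk '/Hugepagesize:/ {print $2}' /proc/meminfo)
(( nr_hugepages = size_kb / default_hugepages_kb ))  # 2097152 / 2048 = 1024

declare -a nodes_test
nodes_test[0]=$nr_hugepages    # every page is requested on NUMA node 0
echo "nr_hugepages=${nr_hugepages} on node 0"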
00:03:35.881 19:10:22 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:35.881 19:10:22 -- setup/hugepages.sh@89 -- # local node
00:03:35.881 19:10:22 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:35.881 19:10:22 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:35.881 19:10:22 -- setup/hugepages.sh@92 -- # local surp
00:03:35.881 19:10:22 -- setup/hugepages.sh@93 -- # local resv
00:03:35.881 19:10:22 -- setup/hugepages.sh@94 -- # local anon
00:03:35.882 19:10:22 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:35.882 19:10:22 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:35.882 19:10:22 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:35.882 19:10:22 -- setup/common.sh@18 -- # local node=
00:03:35.882 19:10:22 -- setup/common.sh@19 -- # local var val
00:03:35.882 19:10:22 -- setup/common.sh@20 -- # local mem_f mem
00:03:35.882 19:10:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:35.882 19:10:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:35.882 19:10:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:35.882 19:10:22 -- setup/common.sh@28 -- # mapfile -t mem
00:03:35.882 19:10:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:35.882 19:10:22 -- setup/common.sh@31 -- # IFS=': '
00:03:35.882 19:10:22 -- setup/common.sh@31 -- # read -r var val _
00:03:35.882 19:10:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73994780 kB' 'MemAvailable: 79021012 kB' 'Buffers: 20532 kB' 'Cached: 12922504 kB' 'SwapCached: 0 kB' 'Active: 8758596 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128360 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564272 kB' 'Mapped: 177236 kB' 'Shmem: 7567556 kB' 'KReclaimable: 486936 kB' 'Slab: 893572 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406636 kB' 'KernelStack: 16080 kB' 'PageTables: 8532 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209532 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:35.882 19:10:22 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:35.882 19:10:22 -- setup/common.sh@32 -- # continue
[... xtrace: identical read/compare/continue cycle for every remaining /proc/meminfo key (MemFree through HardwareCorrupted), none of which matches AnonHugePages ...]
00:03:36.145 19:10:22 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:36.145 19:10:22 -- setup/common.sh@33 -- # echo 0
00:03:36.145 19:10:22 -- setup/common.sh@33 -- # return 0
00:03:36.145 19:10:22 -- setup/hugepages.sh@97 -- # anon=0
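With anon determined, verify_nr_hugepages fetches HugePages_Surp and HugePages_Rsvd through the same get_meminfo path, as the trace below shows. A hypothetical condensed equivalent of the three counter reads (the meminfo helper here is illustrative, not SPDK's):

#!/usr/bin/env bash
# Hypothetical condensed form of the three counter reads in verify_nr_hugepages.
meminfo() { awk -v k="$1" '$1 == k":" {print $2}' /proc/meminfo; }

anon=0
# Count transparent hugepages only when THP is not globally disabled,
# mirroring the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] check above.
if [[ $(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null) != *"[never]"* ]]; then
    anon=$(meminfo AnonHugePages)
fi
surp=$(meminfo HugePages_Surp)
resv=$(meminfo HugePages_Rsvd)
echo "anon=${anon} surp=${surp} resv=${resv}"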
00:03:36.145 19:10:22 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:36.145 19:10:22 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:36.145 19:10:22 -- setup/common.sh@18 -- # local node=
00:03:36.145 19:10:22 -- setup/common.sh@19 -- # local var val
00:03:36.145 19:10:22 -- setup/common.sh@20 -- # local mem_f mem
00:03:36.145 19:10:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:36.145 19:10:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:36.145 19:10:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:36.145 19:10:22 -- setup/common.sh@28 -- # mapfile -t mem
00:03:36.145 19:10:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:36.145 19:10:22 -- setup/common.sh@31 -- # IFS=': '
00:03:36.145 19:10:22 -- setup/common.sh@31 -- # read -r var val _
00:03:36.146 19:10:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73996544 kB' 'MemAvailable: 79022776 kB' 'Buffers: 20532 kB' 'Cached: 12922508 kB' 'SwapCached: 0 kB' 'Active: 8758548 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128312 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564140 kB' 'Mapped: 177232 kB' 'Shmem: 7567560 kB' 'KReclaimable: 486936 kB' 'Slab: 893612 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406676 kB' 'KernelStack: 16128 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:36.146 19:10:22 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.146 19:10:22 -- setup/common.sh@32 -- # continue
[... xtrace: identical read/compare/continue cycle for every remaining /proc/meminfo key (MemFree through HugePages_Rsvd), none of which matches HugePages_Surp ...]
00:03:36.147 19:10:22 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.147 19:10:22 -- setup/common.sh@33 -- # echo 0
00:03:36.147 19:10:22 -- setup/common.sh@33 -- # return 0
00:03:36.147 19:10:22 -- setup/hugepages.sh@99 -- # surp=0
00:03:36.147 19:10:22 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:36.147 19:10:22 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:36.147 19:10:22 -- setup/common.sh@18 -- # local node=
00:03:36.147 19:10:22 -- setup/common.sh@19 -- # local var val
00:03:36.147 19:10:22 -- setup/common.sh@20 -- # local mem_f mem
00:03:36.147 19:10:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:36.147 19:10:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:36.147 19:10:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:36.147 19:10:22 -- setup/common.sh@28 -- # mapfile -t mem
00:03:36.147 19:10:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:36.147 19:10:22 -- setup/common.sh@31 -- # IFS=': '
00:03:36.147 19:10:22 -- setup/common.sh@31 -- # read -r var val _
00:03:36.147 19:10:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73996544 kB' 'MemAvailable: 79022776 kB' 'Buffers: 20532 kB' 'Cached: 12922508 kB' 'SwapCached: 0 kB' 'Active: 8758548 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128312 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564140 kB' 'Mapped: 177232 kB' 'Shmem: 7567560 kB' 'KReclaimable: 486936 kB' 'Slab: 893612 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406676 kB' 'KernelStack: 16128 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:36.147 19:10:22 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:36.147 19:10:22 -- setup/common.sh@32 -- # continue
[... xtrace: the same read/compare/continue cycle repeats for the remaining /proc/meminfo keys (MemFree through Percpu); the captured log breaks off mid-scan at this point ...]
[[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.148 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.148 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.148 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.148 19:10:22 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.148 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.148 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.148 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': ' 00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _ 00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:36.149 19:10:22 -- setup/common.sh@33 -- # echo 0 00:03:36.149 19:10:22 -- setup/common.sh@33 -- # return 0 00:03:36.149 19:10:22 -- setup/hugepages.sh@100 -- # resv=0 00:03:36.149 19:10:22 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:36.149 nr_hugepages=1024 00:03:36.149 19:10:22 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:36.149 
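The trace above is SPDK's get_meminfo helper from test/setup/common.sh walking /proc/meminfo one field at a time until it reaches the requested key (here HugePages_Rsvd, which resolves to 0). A minimal sketch of that loop, reconstructed from the trace rather than copied from the source tree (extglob is assumed on, since the Node-prefix strip at common.sh@29 requires it):

    shopt -s extglob

    # Reconstructed sketch of get_meminfo; names match the trace above.
    get_meminfo() {
        local get=$1 node=$2 var val
        local mem_f=/proc/meminfo mem
        # When a node id is given, prefer that node's meminfo (common.sh@23-24).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem <"$mem_f"
        # Per-node files prefix each line with "Node <n> "; strip it (common.sh@29).
        mem=("${mem[@]#Node +([0-9]) }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # skip fields until the key matches
            echo "$val"                        # e.g. "0" for HugePages_Rsvd
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

Called as get_meminfo HugePages_Rsvd it scans the system-wide file, which is why the log shows one [[ ... ]] / continue pair per meminfo field before the match.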
00:03:36.149 resv_hugepages=0
00:03:36.149 19:10:22 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:36.149 surplus_hugepages=0
00:03:36.149 19:10:22 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:36.149 anon_hugepages=0
00:03:36.149 19:10:22 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:36.149 19:10:22 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:36.149 19:10:22 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:36.149 19:10:22 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:36.149 19:10:22 -- setup/common.sh@18 -- # local node=
00:03:36.149 19:10:22 -- setup/common.sh@19 -- # local var val
00:03:36.149 19:10:22 -- setup/common.sh@20 -- # local mem_f mem
00:03:36.149 19:10:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:36.149 19:10:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:36.149 19:10:22 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:36.149 19:10:22 -- setup/common.sh@28 -- # mapfile -t mem
00:03:36.149 19:10:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:36.149 19:10:22 -- setup/common.sh@31 -- # IFS=': '
00:03:36.149 19:10:22 -- setup/common.sh@31 -- # read -r var val _
00:03:36.149 19:10:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73995884 kB' 'MemAvailable: 79022116 kB' 'Buffers: 20532 kB' 'Cached: 12922532 kB' 'SwapCached: 0 kB' 'Active: 8758536 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128300 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564108 kB' 'Mapped: 177232 kB' 'Shmem: 7567584 kB' 'KReclaimable: 486936 kB' 'Slab: 893612 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406676 kB' 'KernelStack: 16112 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:36.149 19:10:22 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:36.149 19:10:22 -- setup/common.sh@32 -- # continue
[... the read/compare/continue trace repeats for every field of the snapshot above, MemFree through Unaccepted; elapsed time advances from 00:03:36.149 to 00:03:36.151 ...]
00:03:36.151 19:10:22 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:36.151 19:10:22 -- setup/common.sh@33 -- # echo 1024
00:03:36.151 19:10:22 -- setup/common.sh@33 -- # return 0
00:03:36.151 19:10:22 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:36.151 19:10:22 -- setup/hugepages.sh@112 -- # get_nodes
00:03:36.151 19:10:22 -- setup/hugepages.sh@27 -- # local node
00:03:36.151 19:10:22 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:36.151 19:10:22 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:36.151 19:10:22 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:36.151 19:10:22 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:36.151 19:10:22 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:36.151 19:10:22 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:36.151 19:10:22 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:36.151 19:10:22 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:36.151 19:10:22 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:36.151 19:10:22 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:36.151 19:10:22 -- setup/common.sh@18 -- # local node=0
00:03:36.151 19:10:22 -- setup/common.sh@19 -- # local var val
00:03:36.151 19:10:22 -- setup/common.sh@20 -- # local mem_f mem
00:03:36.151 19:10:22 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:36.151 19:10:22 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
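At hugepages.sh@107-@110 the script asserts that the kernel's view matches the request: the configured page count (1024 here) must equal HugePages_Total with surplus and reserved pages folded in. A hedged sketch of that bookkeeping, reusing the get_meminfo sketch above (the variable wiring around the checks is inferred; only the arithmetic appears verbatim in the trace):

    # Sketch of the hugepage accounting asserted at hugepages.sh@107-@110.
    nr_hugepages=1024                      # what the test requested
    resv=$(get_meminfo HugePages_Rsvd)     # 0 in this run
    surp=$(get_meminfo HugePages_Surp)     # 0 in this run
    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    # The request must be fully covered: the kernel-reported total has to
    # equal the requested count plus any surplus and reserved pages.
    (( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))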
00:03:36.151 19:10:22 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:36.151 19:10:22 -- setup/common.sh@28 -- # mapfile -t mem
00:03:36.151 19:10:22 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:36.151 19:10:22 -- setup/common.sh@31 -- # IFS=': '
00:03:36.151 19:10:22 -- setup/common.sh@31 -- # read -r var val _
00:03:36.152 19:10:22 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 39758116 kB' 'MemUsed: 8358848 kB' 'SwapCached: 0 kB' 'Active: 3642004 kB' 'Inactive: 607904 kB' 'Active(anon): 3262368 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842892 kB' 'Mapped: 116184 kB' 'AnonPages: 410224 kB' 'Shmem: 2855352 kB' 'KernelStack: 10072 kB' 'PageTables: 5700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329892 kB' 'Slab: 568616 kB' 'SReclaimable: 329892 kB' 'SUnreclaim: 238724 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:36.152 19:10:22 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.152 19:10:22 -- setup/common.sh@32 -- # continue
[... the read/compare/continue trace repeats for every node0 meminfo field above, MemFree through HugePages_Free; the wall clock ticks from 19:10:22 to 19:10:23 partway through the scan ...]
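The get_nodes call at hugepages.sh@112 enumerated both NUMA nodes and recorded their current hugepage counts (1024 on node0, 0 on node1), and the per-node loop at @115-@117 then re-reads each node's surplus from that node's own meminfo file, which is the node0 scan shown above. A sketch of the enumeration; the source of the per-node count is inferred from the standard sysfs layout, not taken from the script itself:

    # Sketch of get_nodes (hugepages.sh@27-@33); extglob assumed on for +([0-9]).
    shopt -s extglob
    declare -a nodes_sys
    no_nodes=0
    get_nodes() {
        local node
        for node in /sys/devices/system/node/node+([0-9]); do
            # Assumed count source: the node's 2048 kB hugepage pool in sysfs.
            nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
        done
        no_nodes=${#nodes_sys[@]}
        (( no_nodes > 0 ))    # at least one NUMA node must exist
    }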
00:03:36.153 19:10:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:36.153 19:10:23 -- setup/common.sh@33 -- # echo 0
00:03:36.153 19:10:23 -- setup/common.sh@33 -- # return 0
00:03:36.153 19:10:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:36.153 19:10:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:36.153 19:10:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:36.153 19:10:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:36.153 19:10:23 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:36.153 node0=1024 expecting 1024
00:03:36.153 19:10:23 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:36.153 19:10:23 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:36.153 19:10:23 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:36.153 19:10:23 -- setup/hugepages.sh@202 -- # setup output
00:03:36.153 19:10:23 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:36.153 19:10:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:39.442 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:39.442 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:39.442 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:41.344 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:03:41.344 19:10:28 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:03:41.344 19:10:28 -- setup/hugepages.sh@89 -- # local node
00:03:41.344 19:10:28 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:41.344 19:10:28 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:41.344 19:10:28 -- setup/hugepages.sh@92 -- # local surp
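The second scripts/setup.sh run (with NRHUGE=512 and CLEAR_HUGE=no at hugepages.sh@202) deliberately does not shrink the pool: every device is already bound to vfio-pci, and the existing 1024-page reservation already covers the 512-page request, hence the INFO line instead of a reallocation. The guard below is a reconstruction of that observed behaviour, not code lifted from scripts/setup.sh:

    # Reconstructed guard behind the "already allocated" INFO message.
    NRHUGE=512
    CLEAR_HUGE=no
    current=$(get_meminfo HugePages_Total)   # 1024 in this run
    if [[ $CLEAR_HUGE == no ]] && (( current >= NRHUGE )); then
        echo "INFO: Requested $NRHUGE hugepages but $current already allocated on node0"
    else
        # Assumed reallocation path (needs root): resize the global pool.
        echo "$NRHUGE" > /proc/sys/vm/nr_hugepages
    fi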
00:03:41.344 19:10:28 -- setup/hugepages.sh@93 -- # local resv
00:03:41.344 19:10:28 -- setup/hugepages.sh@94 -- # local anon
00:03:41.344 19:10:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:41.344 19:10:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:41.344 19:10:28 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:41.344 19:10:28 -- setup/common.sh@18 -- # local node=
00:03:41.344 19:10:28 -- setup/common.sh@19 -- # local var val
00:03:41.344 19:10:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:41.344 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:41.344 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.344 19:10:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:41.344 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.344 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.344 19:10:28 -- setup/common.sh@31 -- # IFS=': '
00:03:41.344 19:10:28 -- setup/common.sh@31 -- # read -r var val _
00:03:41.344 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73975652 kB' 'MemAvailable: 79001884 kB' 'Buffers: 20532 kB' 'Cached: 12922632 kB' 'SwapCached: 0 kB' 'Active: 8758068 kB' 'Inactive: 4745244 kB' 'Active(anon): 8127832 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563428 kB' 'Mapped: 177436 kB' 'Shmem: 7567684 kB' 'KReclaimable: 486936 kB' 'Slab: 893852 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406916 kB' 'KernelStack: 16032 kB' 'PageTables: 8344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9513992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:41.344 19:10:28 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:41.344 19:10:28 -- setup/common.sh@32 -- # continue
[... the read/compare/continue trace repeats for every field of the snapshot above, MemFree through HardwareCorrupted; elapsed time advances from 00:03:41.344 to 00:03:41.613 ...]
00:03:41.613 19:10:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:41.613 19:10:28 -- setup/common.sh@33 -- # echo 0
00:03:41.613 19:10:28 -- setup/common.sh@33 -- # return 0
00:03:41.613 19:10:28 -- setup/hugepages.sh@97 -- # anon=0
00:03:41.613 19:10:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:41.613 19:10:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:41.613 19:10:28 -- setup/common.sh@18 -- # local node=
00:03:41.613 19:10:28 -- setup/common.sh@19 -- # local var val
00:03:41.613 19:10:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:41.613 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
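The test at hugepages.sh@96 expands to [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]: the left-hand side is the contents of the kernel's transparent-hugepage mode file with the active mode in brackets, so anonymous hugepages are only counted when THP is not pinned to [never]. A sketch of that probe, reusing get_meminfo from above (the file path is the standard kernel location; the surrounding assignment is inferred from the anon=0 that follows in the trace):

    # Sketch of the THP-aware anon_hugepages probe (hugepages.sh@96-@97).
    anon=0
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp != *'[never]'* ]]; then
        # THP may still hand out anonymous hugepages; count them.
        anon=$(get_meminfo AnonHugePages)    # 0 kB in this run
    fi
    echo "anon_hugepages=$anon"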
00:03:41.613 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:41.613 19:10:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:41.613 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:41.613 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:41.613 19:10:28 -- setup/common.sh@31 -- # IFS=': '
00:03:41.613 19:10:28 -- setup/common.sh@31 -- # read -r var val _
00:03:41.613 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73976864 kB' 'MemAvailable: 79003096 kB' 'Buffers: 20532 kB' 'Cached: 12922636 kB' 'SwapCached: 0 kB' 'Active: 8758568 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128332 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 563980 kB' 'Mapped: 177436 kB' 'Shmem: 7567688 kB' 'KReclaimable: 486936 kB' 'Slab: 893868 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406932 kB' 'KernelStack: 16048 kB' 'PageTables: 8404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9514136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209468 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
00:03:41.613 19:10:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:41.613 19:10:28 -- setup/common.sh@32 -- # continue
[... the read/compare/continue trace repeats for each field of the snapshot above, MemFree through FileHugePages; elapsed time advances from 00:03:41.613 to 00:03:41.615, and the log is cut mid-entry below ...]
00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ FilePmdMapped ==
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # continue 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.615 19:10:28 -- setup/common.sh@33 -- # echo 0 00:03:41.615 19:10:28 -- setup/common.sh@33 -- # return 0 00:03:41.615 19:10:28 -- setup/hugepages.sh@99 -- # surp=0 00:03:41.615 19:10:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:41.615 19:10:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:41.615 19:10:28 -- setup/common.sh@18 -- # local node= 00:03:41.615 19:10:28 -- setup/common.sh@19 -- # local var val 00:03:41.615 19:10:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:41.615 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.615 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.615 19:10:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.615 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.615 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73976964 kB' 'MemAvailable: 79003196 kB' 'Buffers: 20532 kB' 'Cached: 12922656 kB' 'SwapCached: 0 kB' 'Active: 8758708 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128472 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564108 kB' 'Mapped: 
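The cycle condensed above is the heart of setup/common.sh's get_meminfo: split each meminfo line on ': ', skip every key that is not the one requested, and echo the value of the first match. A minimal sketch of that pattern, reconstructed from the trace (the sed-based strip of the per-node "Node N " prefix is an assumption standing in for the original's mapfile plus "${mem[@]#Node +([0-9]) }" expansion):

    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        # Per-node statistics live in sysfs and carry a "Node N " line prefix.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # the "continue" cycle seen in the trace
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        return 1
    }

    get_meminfo HugePages_Surp     # prints 0 on the box traced here
    get_meminfo HugePages_Free 0   # same key, restricted to NUMA node 0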
00:03:41.615 19:10:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:41.615 19:10:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:41.615 19:10:28 -- setup/common.sh@18 -- # local node= 00:03:41.615 19:10:28 -- setup/common.sh@19 -- # local var val 00:03:41.615 19:10:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:41.615 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.615 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.615 19:10:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.615 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.615 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.615 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.615 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73976964 kB' 'MemAvailable: 79003196 kB' 'Buffers: 20532 kB' 'Cached: 12922656 kB' 'SwapCached: 0 kB' 'Active: 8758708 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128472 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564108 kB' 'Mapped: 177436 kB' 'Shmem: 7567708 kB' 'KReclaimable: 486936 kB' 'Slab: 893868 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406932 kB' 'KernelStack: 16096 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9514524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209484 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[.. xtrace condensed: setup/common.sh@32 tests each key of the snapshot above against \H\u\g\e\P\a\g\e\s\_\R\s\v\d and falls through to "continue" (with the @31 IFS/read pair) until HugePages_Rsvd matches below ..]
00:03:41.617 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:41.617 19:10:28 -- setup/common.sh@33 -- # echo 0 00:03:41.617 19:10:28 -- setup/common.sh@33 -- # return 0 00:03:41.617 19:10:28 -- setup/hugepages.sh@100 -- # resv=0 00:03:41.617 19:10:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:41.617 nr_hugepages=1024 19:10:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:41.617 resv_hugepages=0 19:10:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:41.617 surplus_hugepages=0 19:10:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:41.617 anon_hugepages=0 19:10:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:41.617 19:10:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
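Restating the arithmetic the trace just performed, with the variable names from setup/hugepages.sh (the values are the ones read back in this run; get_meminfo is the helper sketched earlier):

    nr_hugepages=1024                     # requested pool size
    surp=$(get_meminfo HugePages_Surp)    # 0: no pages allocated beyond the pool
    resv=$(get_meminfo HugePages_Rsvd)    # 0: no pages promised but not yet faulted in
    total=$(get_meminfo HugePages_Total)  # 1024, read back next in the trace

    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2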
00:03:41.617 19:10:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:41.617 19:10:28 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:41.617 19:10:28 -- setup/common.sh@18 -- # local node= 00:03:41.617 19:10:28 -- setup/common.sh@19 -- # local var val 00:03:41.617 19:10:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:41.617 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.617 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:41.617 19:10:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:41.617 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.617 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.617 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.617 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.617 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293528 kB' 'MemFree: 73977252 kB' 'MemAvailable: 79003484 kB' 'Buffers: 20532 kB' 'Cached: 12922676 kB' 'SwapCached: 0 kB' 'Active: 8758772 kB' 'Inactive: 4745244 kB' 'Active(anon): 8128536 kB' 'Inactive(anon): 0 kB' 'Active(file): 630236 kB' 'Inactive(file): 4745244 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 564144 kB' 'Mapped: 177436 kB' 'Shmem: 7567728 kB' 'KReclaimable: 486936 kB' 'Slab: 893868 kB' 'SReclaimable: 486936 kB' 'SUnreclaim: 406932 kB' 'KernelStack: 16096 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486792 kB' 'Committed_AS: 9514536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 209500 kB' 'VmallocChunk: 0 kB' 'Percpu: 64000 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 460220 kB' 'DirectMap2M: 8652800 kB' 'DirectMap1G: 92274688 kB'
[.. xtrace condensed: setup/common.sh@32 tests each key of the snapshot above against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l and falls through to "continue" (with the @31 IFS/read pair) until HugePages_Total matches below ..]
00:03:41.618 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:41.618 19:10:28 -- setup/common.sh@33 -- # echo 1024 00:03:41.618 19:10:28 -- setup/common.sh@33 -- # return 0 00:03:41.618 19:10:28 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
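get_nodes, traced next, enumerates NUMA nodes through sysfs with an extglob pattern and records a per-node hugepage count. A sketch under one assumption: xtrace prints values after expansion, so the source of the 1024/0 split is not visible in the log; reading nr_hugepages from each node's 2048kB pool is a plausible reconstruction, not necessarily the script's actual source:

    shopt -s extglob nullglob
    declare -a nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # node0, node1, ... -> bare index; value is that node's 2M hugepage count
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}   # 2 on this machine
    (( no_nodes > 0 )) || echo 'no NUMA nodes visible in sysfs' >&2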
00:03:41.618 19:10:28 -- setup/hugepages.sh@112 -- # get_nodes 00:03:41.618 19:10:28 -- setup/hugepages.sh@27 -- # local node 00:03:41.618 19:10:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.618 19:10:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:41.618 19:10:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:41.618 19:10:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:41.618 19:10:28 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:41.618 19:10:28 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:41.618 19:10:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:41.618 19:10:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:41.618 19:10:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:41.618 19:10:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:41.618 19:10:28 -- setup/common.sh@18 -- # local node=0 00:03:41.618 19:10:28 -- setup/common.sh@19 -- # local var val 00:03:41.618 19:10:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:41.618 19:10:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:41.618 19:10:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:41.618 19:10:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:41.618 19:10:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:41.618 19:10:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:41.618 19:10:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:41.618 19:10:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:41.619 19:10:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 39765936 kB' 'MemUsed: 8351028 kB' 'SwapCached: 0 kB' 'Active: 3641528 kB' 'Inactive: 607904 kB' 'Active(anon): 3261892 kB' 'Inactive(anon): 0 kB' 'Active(file): 379636 kB' 'Inactive(file): 607904 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3842916 kB' 'Mapped: 116388 kB' 'AnonPages: 409628 kB' 'Shmem: 2855376 kB' 'KernelStack: 10056 kB' 'PageTables: 5608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 329892 kB' 'Slab: 568604 kB' 'SReclaimable: 329892 kB' 'SUnreclaim: 238712 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[.. xtrace condensed: setup/common.sh@32 tests each key of the node0 snapshot above against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and falls through to "continue" (with the @31 IFS/read pair) until HugePages_Surp matches below ..]
00:03:41.619 19:10:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:41.619 19:10:28 -- setup/common.sh@33 -- # echo 0 00:03:41.619 19:10:28 -- setup/common.sh@33 -- # return 0 00:03:41.619 19:10:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:41.619 19:10:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:41.619 19:10:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:41.619 19:10:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:41.619 19:10:28 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:41.619 node0=1024 expecting 1024 19:10:28 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:41.619 00:03:41.619 real 0m10.879s 00:03:41.619 user 0m3.721s 00:03:41.619 sys 0m7.192s 00:03:41.619 19:10:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:41.619 19:10:28 -- common/autotest_common.sh@10 -- # set +x 00:03:41.619 ************************************ 00:03:41.619 END TEST no_shrink_alloc 00:03:41.619 ************************************
00:03:41.619 19:10:28 -- setup/hugepages.sh@217 -- # clear_hp 00:03:41.619 19:10:28 -- setup/hugepages.sh@37 -- # local node hp 00:03:41.619 19:10:28 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:41.619 19:10:28 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:41.619 19:10:28 -- setup/hugepages.sh@41 -- # echo 0 00:03:41.619 19:10:28 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:41.620 19:10:28 -- setup/hugepages.sh@41 -- # echo 0 00:03:41.620 19:10:28 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:41.620 19:10:28 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:41.620 19:10:28 -- setup/hugepages.sh@41 -- # echo 0 00:03:41.620 19:10:28 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:41.620 19:10:28 -- setup/hugepages.sh@41 -- # echo 0 00:03:41.620 19:10:28 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:41.620 19:10:28 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:41.620 00:03:41.620 real 0m42.024s 00:03:41.620 user 0m13.453s 00:03:41.620 sys 0m25.248s 00:03:41.620 19:10:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:03:41.620 19:10:28 -- common/autotest_common.sh@10 -- # set +x 00:03:41.620 ************************************ 00:03:41.620 END TEST hugepages 00:03:41.620 ************************************ 00:03:41.620 19:10:28 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:41.620 19:10:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:41.620 19:10:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:41.620 19:10:28 -- common/autotest_common.sh@10 -- # set +x 00:03:41.881 ************************************ 00:03:41.881 START TEST driver 00:03:41.881 ************************************ 00:03:41.881 19:10:28 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:41.881 * Looking for test storage... 00:03:41.881 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:41.881 19:10:28 -- setup/driver.sh@68 -- # setup reset 00:03:41.881 19:10:28 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:41.881 19:10:28 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:48.534 19:10:35 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:48.534 19:10:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:48.534 19:10:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:48.534 19:10:35 -- common/autotest_common.sh@10 -- # set +x 00:03:48.534 ************************************ 00:03:48.534 START TEST guess_driver 00:03:48.534 ************************************ 00:03:48.534 19:10:35 -- common/autotest_common.sh@1111 -- # guess_driver 00:03:48.534 19:10:35 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:48.534 19:10:35 -- setup/driver.sh@47 -- # local fail=0 00:03:48.534 19:10:35 -- setup/driver.sh@49 -- # pick_driver 00:03:48.534 19:10:35 -- setup/driver.sh@36 -- # vfio 00:03:48.534 19:10:35 -- setup/driver.sh@21 -- # local iommu_grups 00:03:48.534 19:10:35 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:48.534 19:10:35 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:48.534 19:10:35 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:48.534 19:10:35 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:48.534 19:10:35 -- setup/driver.sh@29 -- # (( 190 > 0 ))
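The next lines resolve vfio_pci through modprobe --show-depends and settle on vfio-pci. A condensed restatement of that decision; the else branch is inferred from the 'No valid driver found' sentinel tested at driver.sh@51 rather than shown being taken in this run:

    iommu_groups=(/sys/kernel/iommu_groups/*)
    if (( ${#iommu_groups[@]} > 0 )) &&
       modprobe --show-depends vfio_pci | grep -q '\.ko'; then
        driver=vfio-pci          # IOMMU populated and module graph resolves
    else
        driver='No valid driver found'
    fi
    echo "Looking for driver=$driver"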
/lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:48.534 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:48.534 19:10:35 -- setup/driver.sh@30 -- # return 0 00:03:48.534 19:10:35 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:48.534 19:10:35 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:48.534 19:10:35 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:48.534 19:10:35 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:48.534 Looking for driver=vfio-pci 00:03:48.534 19:10:35 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:48.534 19:10:35 -- setup/driver.sh@45 -- # setup output config 00:03:48.534 19:10:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:48.534 19:10:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 
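
Note: guess_driver settles on vfio-pci because the host exposes populated IOMMU groups (the (( 190 > 0 )) check above) and modprobe can resolve vfio_pci's full dependency chain (the insmod listing). A sketch of the same decision, hedged to standard modprobe and sysfs behavior only:

    # Pick vfio-pci when the IOMMU is active and the module is loadable.
    pick_driver() {
        # compgen -G succeeds only if at least one IOMMU group exists.
        if compgen -G '/sys/kernel/iommu_groups/*' > /dev/null &&
           modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
            echo vfio-pci
        else
            echo 'No valid driver found'
            return 1
        fi
    }

    driver=$(pick_driver)   # vfio-pci on this host
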
00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:51.876 19:10:38 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:51.876 19:10:38 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:51.876 19:10:38 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:55.162 19:10:41 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:55.162 19:10:41 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:55.162 19:10:41 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:57.064 19:10:43 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:57.064 19:10:43 -- setup/driver.sh@65 -- # setup reset 00:03:57.064 19:10:43 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:57.064 19:10:43 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:03.629 00:04:03.629 real 0m15.245s 00:04:03.629 user 0m3.742s 00:04:03.629 sys 0m7.657s 00:04:03.629 19:10:50 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:03.630 19:10:50 -- common/autotest_common.sh@10 -- # set +x 00:04:03.630 ************************************ 00:04:03.630 END TEST guess_driver 00:04:03.630 ************************************ 00:04:03.630 00:04:03.630 real 0m21.864s 00:04:03.630 user 0m5.596s 00:04:03.630 sys 0m11.561s 00:04:03.630 19:10:50 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:03.630 19:10:50 -- common/autotest_common.sh@10 -- # set +x 00:04:03.630 ************************************ 00:04:03.630 END TEST driver 00:04:03.630 ************************************ 00:04:03.888 19:10:50 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:03.888 19:10:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:03.888 19:10:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:03.888 19:10:50 -- common/autotest_common.sh@10 -- # set +x 00:04:03.888 ************************************ 00:04:03.888 START TEST devices 00:04:03.888 ************************************ 00:04:03.888 19:10:50 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:04.146 * Looking for test storage... 
00:04:04.146 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:04.146 19:10:50 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:04.146 19:10:50 -- setup/devices.sh@192 -- # setup reset 00:04:04.147 19:10:50 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:04.147 19:10:50 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:10.712 19:10:56 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:10.712 19:10:56 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:10.712 19:10:56 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:10.712 19:10:56 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:10.712 19:10:56 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:10.712 19:10:56 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:10.712 19:10:56 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:10.712 19:10:56 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:10.712 19:10:56 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:10.712 19:10:56 -- setup/devices.sh@196 -- # blocks=() 00:04:10.712 19:10:56 -- setup/devices.sh@196 -- # declare -a blocks 00:04:10.712 19:10:56 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:10.712 19:10:56 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:10.712 19:10:56 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:10.712 19:10:56 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:10.712 19:10:56 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:10.712 19:10:56 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:10.712 19:10:56 -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:04:10.712 19:10:56 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:10.712 19:10:56 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:10.712 19:10:56 -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:10.712 19:10:56 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:10.712 No valid GPT data, bailing 00:04:10.712 19:10:56 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:10.712 19:10:56 -- scripts/common.sh@391 -- # pt= 00:04:10.712 19:10:56 -- scripts/common.sh@392 -- # return 1 00:04:10.712 19:10:56 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:10.712 19:10:56 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:10.712 19:10:56 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:10.712 19:10:56 -- setup/common.sh@80 -- # echo 4000787030016 00:04:10.712 19:10:56 -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:04:10.712 19:10:56 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:10.712 19:10:56 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:04:10.712 19:10:56 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:10.712 19:10:56 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:10.712 19:10:56 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:10.712 19:10:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:10.712 19:10:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:10.712 19:10:56 -- common/autotest_common.sh@10 -- # set +x 00:04:10.712 ************************************ 00:04:10.712 START TEST nvme_mount 00:04:10.712 ************************************ 00:04:10.712 
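
Note: before nvme_mount starts, the devices suite has just qualified nvme0n1: spdk-gpt.py found no partition table ("No valid GPT data, bailing"), blkid reported an empty PTTYPE, and the 4000787030016-byte capacity cleared min_disk_size (3 GiB). A sketch of that gate, assuming the usual 512-byte-sector /sys/block size file:

    min_disk_size=$((3 * 1024 * 1024 * 1024))    # 3221225472, as in devices.sh@198

    disk_usable() {
        local dev=$1                             # e.g. nvme0n1
        # An empty PTTYPE value means no partition table is present.
        [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || return 1
        local sectors
        sectors=$(< "/sys/block/$dev/size")      # size in 512-byte sectors
        (( sectors * 512 >= min_disk_size ))
    }

    disk_usable nvme0n1 && echo "nvme0n1 accepted"
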
19:10:56 -- common/autotest_common.sh@1111 -- # nvme_mount 00:04:10.712 19:10:56 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:10.712 19:10:56 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:10.712 19:10:56 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:10.712 19:10:56 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:10.712 19:10:56 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:10.712 19:10:56 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:10.712 19:10:56 -- setup/common.sh@40 -- # local part_no=1 00:04:10.712 19:10:56 -- setup/common.sh@41 -- # local size=1073741824 00:04:10.712 19:10:56 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:10.712 19:10:56 -- setup/common.sh@44 -- # parts=() 00:04:10.712 19:10:56 -- setup/common.sh@44 -- # local parts 00:04:10.712 19:10:56 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:10.712 19:10:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:10.712 19:10:56 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:10.712 19:10:56 -- setup/common.sh@46 -- # (( part++ )) 00:04:10.712 19:10:56 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:10.712 19:10:56 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:10.712 19:10:56 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:10.712 19:10:56 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:10.972 Creating new GPT entries in memory. 00:04:10.972 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:10.972 other utilities. 00:04:10.972 19:10:57 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:10.972 19:10:57 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:10.972 19:10:57 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:10.972 19:10:57 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:10.972 19:10:57 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:11.908 Creating new GPT entries in memory. 00:04:11.908 The operation has completed successfully. 
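
Note: the sector arithmetic traced above is compact: common.sh turns size=1073741824 bytes into 2097152 512-byte sectors via (( size /= 512 )), starts partition 1 at LBA 2048, and ends it at 2048 + 2097152 - 1 = 2099199, which is exactly the --new=1:2048:2099199 call. A sketch of the same sequence; for the two-partition dm_mount case later in the log the arithmetic simply continues, giving --new=2:2099200:4196351:

    size=1073741824                        # 1 GiB per partition
    (( size /= 512 ))                      # 2097152 sectors

    sgdisk /dev/nvme0n1 --zap-all          # destroy GPT and MBR structures

    part_start=2048                        # first usable aligned LBA
    part_end=$(( part_start + size - 1 ))  # 2099199
    # flock serializes the rewrite against concurrent probes of the disk.
    flock /dev/nvme0n1 \
        sgdisk /dev/nvme0n1 --new=1:${part_start}:${part_end}
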
00:04:11.908 19:10:58 -- setup/common.sh@57 -- # (( part++ )) 00:04:11.908 19:10:58 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:11.908 19:10:58 -- setup/common.sh@62 -- # wait 1584363 00:04:11.908 19:10:58 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.908 19:10:58 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:11.908 19:10:58 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:11.908 19:10:58 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:11.908 19:10:58 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:11.908 19:10:58 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.167 19:10:58 -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:12.167 19:10:58 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:12.167 19:10:58 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:12.167 19:10:58 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:12.167 19:10:58 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:12.167 19:10:58 -- setup/devices.sh@53 -- # local found=0 00:04:12.167 19:10:58 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:12.167 19:10:58 -- setup/devices.sh@56 -- # : 00:04:12.167 19:10:58 -- setup/devices.sh@59 -- # local pci status 00:04:12.167 19:10:58 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:12.167 19:10:58 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:12.167 19:10:58 -- setup/devices.sh@47 -- # setup output config 00:04:12.167 19:10:58 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:12.167 19:10:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:15.467 19:11:01 -- setup/devices.sh@63 -- # found=1 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 
]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:15.467 19:11:01 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:15.467 19:11:01 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.371 19:11:03 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:17.371 19:11:03 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:17.371 19:11:03 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:03 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:17.371 19:11:03 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.371 19:11:03 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:17.371 19:11:03 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:03 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:03 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:17.371 19:11:03 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:17.371 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:17.371 19:11:03 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:17.371 19:11:03 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:17.371 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:17.371 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:17.371 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:17.371 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:17.371 19:11:04 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:17.371 19:11:04 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:17.371 19:11:04 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:04 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:17.371 19:11:04 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:17.371 19:11:04 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:04 -- setup/devices.sh@116 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.371 19:11:04 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:17.371 19:11:04 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:17.371 19:11:04 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.371 19:11:04 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.371 19:11:04 -- setup/devices.sh@53 -- # local found=0 00:04:17.371 19:11:04 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:17.371 19:11:04 -- setup/devices.sh@56 -- # : 00:04:17.371 19:11:04 -- setup/devices.sh@59 -- # local pci status 00:04:17.371 19:11:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:17.371 19:11:04 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:17.371 19:11:04 -- setup/devices.sh@47 -- # setup output config 00:04:17.371 19:11:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:17.371 19:11:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:20.657 19:11:07 -- setup/devices.sh@63 -- # found=1 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:20.657 19:11:07 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:20.657 19:11:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.562 19:11:09 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:22.562 19:11:09 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:22.562 19:11:09 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.562 19:11:09 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:22.562 19:11:09 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.562 19:11:09 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.562 19:11:09 -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:04:22.562 19:11:09 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:22.562 19:11:09 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:22.562 19:11:09 -- setup/devices.sh@50 -- # local mount_point= 00:04:22.562 19:11:09 -- setup/devices.sh@51 -- # local test_file= 00:04:22.562 19:11:09 -- setup/devices.sh@53 -- # local found=0 00:04:22.562 19:11:09 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:22.562 19:11:09 -- setup/devices.sh@59 -- # local pci status 00:04:22.562 19:11:09 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.562 19:11:09 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:22.562 19:11:09 -- setup/devices.sh@47 -- # setup output config 00:04:22.562 19:11:09 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.562 19:11:09 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:25.843 19:11:12 -- setup/devices.sh@63 -- # found=1 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.843 19:11:12 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:25.843 19:11:12 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:27.746 19:11:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:27.746 19:11:14 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:27.746 19:11:14 -- setup/devices.sh@68 -- # return 0 00:04:27.746 19:11:14 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:27.746 19:11:14 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:27.746 19:11:14 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:27.746 19:11:14 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:27.746 19:11:14 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:27.746 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:27.746 00:04:27.746 real 0m17.477s 00:04:27.746 user 0m4.888s 00:04:27.746 sys 0m10.279s 00:04:27.746 19:11:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:27.746 19:11:14 -- common/autotest_common.sh@10 -- # set +x 00:04:27.746 ************************************ 00:04:27.746 END TEST nvme_mount 00:04:27.746 ************************************ 00:04:27.746 19:11:14 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:27.746 19:11:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:27.746 19:11:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:27.746 19:11:14 -- common/autotest_common.sh@10 -- # set +x 00:04:27.746 ************************************ 00:04:27.746 START TEST dm_mount 00:04:27.746 ************************************ 00:04:27.746 19:11:14 -- common/autotest_common.sh@1111 -- # dm_mount 00:04:27.746 19:11:14 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:27.746 19:11:14 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:27.746 19:11:14 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:27.746 19:11:14 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:27.746 19:11:14 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:27.746 19:11:14 -- setup/common.sh@40 -- # local part_no=2 00:04:27.746 19:11:14 -- setup/common.sh@41 -- # local size=1073741824 00:04:27.746 19:11:14 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:27.746 19:11:14 -- setup/common.sh@44 -- # parts=() 00:04:27.746 19:11:14 -- setup/common.sh@44 -- # local parts 00:04:27.746 19:11:14 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:27.746 19:11:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.746 19:11:14 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:27.746 19:11:14 -- setup/common.sh@46 -- # (( part++ )) 00:04:27.746 19:11:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.746 19:11:14 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:27.747 19:11:14 -- setup/common.sh@46 -- # (( part++ )) 00:04:27.747 19:11:14 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:27.747 19:11:14 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:27.747 19:11:14 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:27.747 19:11:14 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:28.681 Creating new GPT entries in memory. 00:04:28.681 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:28.681 other utilities. 00:04:28.681 19:11:15 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:28.681 19:11:15 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:28.681 19:11:15 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:28.682 19:11:15 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:28.682 19:11:15 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:29.616 Creating new GPT entries in memory. 00:04:29.617 The operation has completed successfully. 
00:04:29.617 19:11:16 -- setup/common.sh@57 -- # (( part++ )) 00:04:29.617 19:11:16 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:29.617 19:11:16 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:29.617 19:11:16 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:29.617 19:11:16 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:30.550 The operation has completed successfully. 00:04:30.550 19:11:17 -- setup/common.sh@57 -- # (( part++ )) 00:04:30.550 19:11:17 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:30.550 19:11:17 -- setup/common.sh@62 -- # wait 1589084 00:04:30.809 19:11:17 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:30.809 19:11:17 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.809 19:11:17 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.809 19:11:17 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:30.809 19:11:17 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:30.809 19:11:17 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.809 19:11:17 -- setup/devices.sh@161 -- # break 00:04:30.809 19:11:17 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.809 19:11:17 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:30.809 19:11:17 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:30.809 19:11:17 -- setup/devices.sh@166 -- # dm=dm-0 00:04:30.809 19:11:17 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:30.809 19:11:17 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:30.809 19:11:17 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.809 19:11:17 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:30.809 19:11:17 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.809 19:11:17 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:30.809 19:11:17 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:30.809 19:11:17 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.809 19:11:17 -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.809 19:11:17 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:30.809 19:11:17 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:30.809 19:11:17 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:30.809 19:11:17 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:30.809 19:11:17 -- setup/devices.sh@53 -- # local found=0 00:04:30.810 19:11:17 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:30.810 19:11:17 -- setup/devices.sh@56 -- # : 
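
Note: once both partitions exist, dm_mount layers a device-mapper node over them, waits for /dev/mapper/nvme_dm_test to appear, resolves it to dm-0 through readlink, checks the holders links, and then formats and mounts it like any disk. The dmsetup table itself is elided in the trace; a linear concatenation of the two 1 GiB partitions is one plausible reconstruction, sketched here with /mnt standing in for the mount point:

    # Hypothetical table: the trace only shows 'dmsetup create nvme_dm_test'.
    printf '%s\n' \
        '0 2097152 linear /dev/nvme0n1p1 0' \
        '2097152 2097152 linear /dev/nvme0n1p2 0' |
        dmsetup create nvme_dm_test

    # Resolve the friendly name to its dm node, as devices.sh@165-166 does.
    dm=$(readlink -f /dev/mapper/nvme_dm_test)   # -> /dev/dm-0
    dm=${dm##*/}                                 # -> dm-0

    # Both backing partitions now list dm-0 as a holder.
    [[ -e /sys/class/block/nvme0n1p1/holders/$dm ]]
    [[ -e /sys/class/block/nvme0n1p2/holders/$dm ]]

    mkfs.ext4 -qF /dev/mapper/nvme_dm_test
    mount /dev/mapper/nvme_dm_test /mnt
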
00:04:30.810 19:11:17 -- setup/devices.sh@59 -- # local pci status 00:04:30.810 19:11:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:30.810 19:11:17 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:30.810 19:11:17 -- setup/devices.sh@47 -- # setup output config 00:04:30.810 19:11:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.810 19:11:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:34.093 19:11:21 -- setup/devices.sh@63 -- # found=1 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.093 19:11:21 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:34.093 19:11:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.097 19:11:23 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:36.097 19:11:23 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:36.097 19:11:23 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.097 19:11:23 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:36.097 19:11:23 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:36.097 19:11:23 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:36.354 19:11:23 -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:36.354 19:11:23 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:36.354 19:11:23 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:36.354 19:11:23 -- setup/devices.sh@50 -- # local mount_point= 00:04:36.354 19:11:23 -- setup/devices.sh@51 -- # local test_file= 00:04:36.354 19:11:23 -- setup/devices.sh@53 -- # local found=0 00:04:36.354 19:11:23 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:36.354 19:11:23 -- setup/devices.sh@59 -- # local pci status 00:04:36.354 19:11:23 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:36.354 19:11:23 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:36.354 19:11:23 -- setup/devices.sh@47 -- # setup output config 00:04:36.354 19:11:23 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:36.354 19:11:23 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:39.641 19:11:26 -- setup/devices.sh@63 -- # found=1 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.641 19:11:26 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:39.641 19:11:26 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:41.548 19:11:28 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:41.548 19:11:28 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:41.548 19:11:28 -- setup/devices.sh@68 -- # return 0 00:04:41.548 19:11:28 -- setup/devices.sh@187 -- # cleanup_dm 00:04:41.548 19:11:28 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:41.548 19:11:28 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:41.548 19:11:28 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:41.807 19:11:28 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:41.807 19:11:28 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:41.807 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:41.807 19:11:28 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:41.807 19:11:28 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:41.807 00:04:41.807 real 0m14.148s 00:04:41.807 user 0m3.862s 00:04:41.807 sys 0m7.281s 00:04:41.807 19:11:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:04:41.807 19:11:28 -- common/autotest_common.sh@10 -- # set +x 00:04:41.807 ************************************ 00:04:41.807 END TEST dm_mount 00:04:41.807 ************************************ 00:04:41.807 19:11:28 -- setup/devices.sh@1 -- # cleanup 00:04:41.807 19:11:28 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:41.807 19:11:28 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:41.807 19:11:28 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:41.807 19:11:28 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:41.807 19:11:28 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:41.807 19:11:28 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:42.065 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:42.065 /dev/nvme0n1: 8 
bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54
00:04:42.065 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:42.065 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:42.065 19:11:28 -- setup/devices.sh@12 -- # cleanup_dm
00:04:42.065 19:11:28 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount
00:04:42.065 19:11:28 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:04:42.065 19:11:28 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:42.065 19:11:28 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:04:42.065 19:11:28 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:04:42.065 19:11:28 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:04:42.065
00:04:42.065 real 0m38.142s
00:04:42.065 user 0m10.813s
00:04:42.065 sys 0m21.715s
00:04:42.065 19:11:28 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:42.065 19:11:28 -- common/autotest_common.sh@10 -- # set +x
00:04:42.065 ************************************
00:04:42.065 END TEST devices
00:04:42.065 ************************************
00:04:42.065
00:04:42.065 real 2m16.822s
00:04:42.065 user 0m39.800s
00:04:42.065 sys 1m19.183s
00:04:42.065 19:11:29 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:04:42.065 19:11:29 -- common/autotest_common.sh@10 -- # set +x
00:04:42.065 ************************************
00:04:42.065 END TEST setup.sh
00:04:42.065 ************************************
00:04:42.065 19:11:29 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:04:45.350 Hugepages
00:04:45.350 node hugesize free / total
00:04:45.350 node0 1048576kB 0 / 0
00:04:45.350 node0 2048kB 2048 / 2048
00:04:45.350 node1 1048576kB 0 / 0
00:04:45.350 node1 2048kB 0 / 0
00:04:45.350
00:04:45.350 Type BDF Vendor Device NUMA Driver Device Block devices
00:04:45.350 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:04:45.350 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:04:45.350 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:04:45.350 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:04:45.350 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:04:45.350 19:11:32 -- spdk/autotest.sh@130 -- # uname -s
00:04:45.350 19:11:32 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]]
00:04:45.350 19:11:32 -- spdk/autotest.sh@132 -- # nvme_namespace_revert
00:04:45.350 19:11:32 -- common/autotest_common.sh@1517 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:48.658 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:48.658 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:48.658 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:48.658 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:48.658 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:48.658 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:48.658 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:51.958 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:04:53.862 19:11:40 -- common/autotest_common.sh@1518 -- # sleep 1 00:04:54.799 19:11:41 -- common/autotest_common.sh@1519 -- # bdfs=() 00:04:54.799 19:11:41 -- common/autotest_common.sh@1519 -- # local bdfs 00:04:54.799 19:11:41 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:04:54.799 19:11:41 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:04:54.799 19:11:41 -- common/autotest_common.sh@1499 -- # bdfs=() 00:04:54.799 19:11:41 -- common/autotest_common.sh@1499 -- # local bdfs 00:04:54.799 19:11:41 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:54.799 19:11:41 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:54.799 19:11:41 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:04:54.799 19:11:41 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:04:54.799 19:11:41 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:1a:00.0 00:04:54.799 19:11:41 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:57.326 Waiting for block devices as requested 00:04:57.583 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:04:57.583 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:57.842 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:57.842 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:57.842 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:57.842 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:58.101 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:58.101 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:58.101 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:58.359 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:58.359 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:58.359 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:58.617 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:58.617 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:58.617 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:58.876 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:58.876 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:00.781 19:11:47 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:05:00.781 19:11:47 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1488 -- # readlink -f /sys/class/nvme/nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1488 -- # grep 0000:1a:00.0/nvme/nvme 00:05:00.781 19:11:47 -- common/autotest_common.sh@1488 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1489 -- # [[ -z 
/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:05:00.781 19:11:47 -- common/autotest_common.sh@1493 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1493 -- # printf '%s\n' nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:05:00.781 19:11:47 -- common/autotest_common.sh@1531 -- # grep oacs 00:05:00.781 19:11:47 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:05:00.781 19:11:47 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:05:00.781 19:11:47 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:05:00.781 19:11:47 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:05:00.781 19:11:47 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:05:00.781 19:11:47 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:00.782 19:11:47 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:00.782 19:11:47 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:05:00.782 19:11:47 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:05:00.782 19:11:47 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:05:00.782 19:11:47 -- common/autotest_common.sh@1543 -- # continue 00:05:00.782 19:11:47 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:00.782 19:11:47 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:00.782 19:11:47 -- common/autotest_common.sh@10 -- # set +x 00:05:00.782 19:11:47 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:00.782 19:11:47 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:00.782 19:11:47 -- common/autotest_common.sh@10 -- # set +x 00:05:00.782 19:11:47 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:04.064 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:04.064 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:04.323 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:04.323 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:04.323 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:04.323 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:07.611 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:09.518 19:11:56 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:09.518 19:11:56 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:09.518 19:11:56 -- common/autotest_common.sh@10 -- # set +x 00:05:09.518 19:11:56 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:09.518 19:11:56 -- common/autotest_common.sh@1577 -- # mapfile -t bdfs 00:05:09.518 19:11:56 -- common/autotest_common.sh@1577 -- # get_nvme_bdfs_by_id 0x0a54 00:05:09.518 19:11:56 -- common/autotest_common.sh@1563 -- # bdfs=() 00:05:09.518 19:11:56 -- common/autotest_common.sh@1563 -- # local bdfs 00:05:09.518 
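
Note: the pre-cleanup pass above decides whether a controller is worth reverting from two nvme id-ctrl fields: OACS (0xe here; bit 3, value 0x8, signals Namespace Management support in the NVMe spec) and unvmcap (0, i.e. no unallocated capacity). A sketch of the same probe against nvme-cli's human-readable output; the grep/cut parsing mirrors the pipeline in the trace:

    ctrlr=/dev/nvme0

    oacs=$(nvme id-ctrl "$ctrlr" | grep oacs | cut -d: -f2)   # ' 0xe'
    oacs_ns_manage=$(( oacs & 0x8 ))   # isolate the Namespace Management bit

    if (( oacs_ns_manage != 0 )); then
        unvmcap=$(nvme id-ctrl "$ctrlr" | grep unvmcap | cut -d: -f2)
        # unvmcap of 0 means nothing is unallocated, so the revert is skipped.
        (( unvmcap == 0 )) && echo "skipping revert for $ctrlr"
    fi
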
19:11:56 -- common/autotest_common.sh@1565 -- # get_nvme_bdfs 00:05:09.518 19:11:56 -- common/autotest_common.sh@1499 -- # bdfs=() 00:05:09.518 19:11:56 -- common/autotest_common.sh@1499 -- # local bdfs 00:05:09.518 19:11:56 -- common/autotest_common.sh@1500 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:09.518 19:11:56 -- common/autotest_common.sh@1500 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:09.518 19:11:56 -- common/autotest_common.sh@1500 -- # jq -r '.config[].params.traddr' 00:05:09.518 19:11:56 -- common/autotest_common.sh@1501 -- # (( 1 == 0 )) 00:05:09.518 19:11:56 -- common/autotest_common.sh@1505 -- # printf '%s\n' 0000:1a:00.0 00:05:09.518 19:11:56 -- common/autotest_common.sh@1565 -- # for bdf in $(get_nvme_bdfs) 00:05:09.518 19:11:56 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:05:09.518 19:11:56 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:05:09.518 19:11:56 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:09.518 19:11:56 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:05:09.518 19:11:56 -- common/autotest_common.sh@1572 -- # printf '%s\n' 0000:1a:00.0 00:05:09.518 19:11:56 -- common/autotest_common.sh@1578 -- # [[ -z 0000:1a:00.0 ]] 00:05:09.518 19:11:56 -- common/autotest_common.sh@1583 -- # spdk_tgt_pid=1599044 00:05:09.518 19:11:56 -- common/autotest_common.sh@1582 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:09.518 19:11:56 -- common/autotest_common.sh@1584 -- # waitforlisten 1599044 00:05:09.518 19:11:56 -- common/autotest_common.sh@817 -- # '[' -z 1599044 ']' 00:05:09.518 19:11:56 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:09.518 19:11:56 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:09.518 19:11:56 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:09.518 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:09.518 19:11:56 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:09.518 19:11:56 -- common/autotest_common.sh@10 -- # set +x 00:05:09.518 [2024-04-24 19:11:56.359590] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
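The xtrace above is get_nvme_bdfs_by_id 0x0a54 at work: it enumerates NVMe BDFs via gen_nvme.sh, then keeps only the ones whose PCI device ID matches. A minimal standalone sketch of the same filter, assuming an SPDK checkout at $SPDK_DIR (the variable name is ours, not the harness's):

    #!/usr/bin/env bash
    # Sketch: reproduce get_nvme_bdfs_by_id outside the harness.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    target=0x0a54   # the device ID the log compares against

    # Enumerate NVMe BDFs the same way the log does: gen_nvme.sh emits a
    # bdev config whose traddr params are the PCI addresses.
    bdfs=($("$SPDK_DIR/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))

    for bdf in "${bdfs[@]}"; do
        device=$(cat "/sys/bus/pci/devices/$bdf/device")
        [[ $device == "$target" ]] && echo "$bdf"   # prints 0000:1a:00.0 on this box
    done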
00:05:09.518 [2024-04-24 19:11:56.359642] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1599044 ] 00:05:09.518 EAL: No free 2048 kB hugepages reported on node 1 00:05:09.519 [2024-04-24 19:11:56.432093] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:09.519 [2024-04-24 19:11:56.524511] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:10.456 19:11:57 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:10.456 19:11:57 -- common/autotest_common.sh@850 -- # return 0 00:05:10.456 19:11:57 -- common/autotest_common.sh@1586 -- # bdf_id=0 00:05:10.457 19:11:57 -- common/autotest_common.sh@1587 -- # for bdf in "${bdfs[@]}" 00:05:10.457 19:11:57 -- common/autotest_common.sh@1588 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0 00:05:13.743 nvme0n1 00:05:13.743 19:12:00 -- common/autotest_common.sh@1590 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:13.743 [2024-04-24 19:12:00.331468] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:13.743 request: 00:05:13.743 { 00:05:13.743 "nvme_ctrlr_name": "nvme0", 00:05:13.743 "password": "test", 00:05:13.743 "method": "bdev_nvme_opal_revert", 00:05:13.743 "req_id": 1 00:05:13.743 } 00:05:13.743 Got JSON-RPC error response 00:05:13.743 response: 00:05:13.743 { 00:05:13.743 "code": -32602, 00:05:13.743 "message": "Invalid parameters" 00:05:13.743 } 00:05:13.743 19:12:00 -- common/autotest_common.sh@1590 -- # true 00:05:13.743 19:12:00 -- common/autotest_common.sh@1591 -- # (( ++bdf_id )) 00:05:13.743 19:12:00 -- common/autotest_common.sh@1594 -- # killprocess 1599044 00:05:13.743 19:12:00 -- common/autotest_common.sh@936 -- # '[' -z 1599044 ']' 00:05:13.743 19:12:00 -- common/autotest_common.sh@940 -- # kill -0 1599044 00:05:13.743 19:12:00 -- common/autotest_common.sh@941 -- # uname 00:05:13.743 19:12:00 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:13.743 19:12:00 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1599044 00:05:13.744 19:12:00 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:13.744 19:12:00 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:13.744 19:12:00 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1599044' 00:05:13.744 killing process with pid 1599044 00:05:13.744 19:12:00 -- common/autotest_common.sh@955 -- # kill 1599044 00:05:13.744 19:12:00 -- common/autotest_common.sh@960 -- # wait 1599044 00:05:18.036 19:12:04 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:18.036 19:12:04 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:18.036 19:12:04 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:05:18.036 19:12:04 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:05:18.036 19:12:04 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:18.036 19:12:04 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:18.036 19:12:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.036 19:12:04 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:18.036 19:12:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:18.036 19:12:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 
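The -32602 "Invalid parameters" response above is the expected outcome on this drive: the controller does not support Opal, so bdev_nvme_opal_revert fails by design and opal_revert_cleanup swallows the error (the "# true" in the xtrace). A sketch of replaying the same two RPCs by hand against a running spdk_tgt:

    # Sketch: replay the opal-revert probe from the log. On a non-Opal
    # controller the revert is expected to fail with -32602, as above.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    rpc="$SPDK_DIR/scripts/rpc.py"

    "$rpc" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0   # -> nvme0n1
    "$rpc" bdev_nvme_opal_revert -b nvme0 -p test || echo "no Opal support (expected here)"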
00:05:18.036 19:12:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.036 ************************************ 00:05:18.036 START TEST env 00:05:18.036 ************************************ 00:05:18.036 19:12:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:18.036 * Looking for test storage... 00:05:18.036 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:18.036 19:12:04 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:18.036 19:12:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:18.036 19:12:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:18.036 19:12:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.036 ************************************ 00:05:18.036 START TEST env_memory 00:05:18.036 ************************************ 00:05:18.036 19:12:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:18.036 00:05:18.036 00:05:18.036 CUnit - A unit testing framework for C - Version 2.1-3 00:05:18.036 http://cunit.sourceforge.net/ 00:05:18.036 00:05:18.036 00:05:18.036 Suite: memory 00:05:18.036 Test: alloc and free memory map ...[2024-04-24 19:12:04.754315] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:18.036 passed 00:05:18.036 Test: mem map translation ...[2024-04-24 19:12:04.767438] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:18.036 [2024-04-24 19:12:04.767454] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:18.036 [2024-04-24 19:12:04.767482] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:18.036 [2024-04-24 19:12:04.767491] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:18.036 passed 00:05:18.036 Test: mem map registration ...[2024-04-24 19:12:04.788992] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:18.036 [2024-04-24 19:12:04.789009] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:18.036 passed 00:05:18.036 Test: mem map adjacent registrations ...passed 00:05:18.036 00:05:18.036 Run Summary: Type Total Ran Passed Failed Inactive 00:05:18.036 suites 1 1 n/a 0 0 00:05:18.036 tests 4 4 4 0 0 00:05:18.036 asserts 152 152 152 0 n/a 00:05:18.036 00:05:18.036 Elapsed time = 0.086 seconds 00:05:18.036 00:05:18.036 real 0m0.098s 00:05:18.036 user 0m0.085s 00:05:18.036 sys 0m0.013s 00:05:18.036 19:12:04 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:18.036 19:12:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.036 ************************************ 00:05:18.036 END TEST env_memory 00:05:18.036 
************************************ 00:05:18.036 19:12:04 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:18.036 19:12:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:18.036 19:12:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:18.036 19:12:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.036 ************************************ 00:05:18.036 START TEST env_vtophys 00:05:18.036 ************************************ 00:05:18.036 19:12:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:18.036 EAL: lib.eal log level changed from notice to debug 00:05:18.037 EAL: Detected lcore 0 as core 0 on socket 0 00:05:18.037 EAL: Detected lcore 1 as core 1 on socket 0 00:05:18.037 EAL: Detected lcore 2 as core 2 on socket 0 00:05:18.037 EAL: Detected lcore 3 as core 3 on socket 0 00:05:18.037 EAL: Detected lcore 4 as core 4 on socket 0 00:05:18.037 EAL: Detected lcore 5 as core 8 on socket 0 00:05:18.037 EAL: Detected lcore 6 as core 9 on socket 0 00:05:18.037 EAL: Detected lcore 7 as core 10 on socket 0 00:05:18.037 EAL: Detected lcore 8 as core 11 on socket 0 00:05:18.037 EAL: Detected lcore 9 as core 16 on socket 0 00:05:18.037 EAL: Detected lcore 10 as core 17 on socket 0 00:05:18.037 EAL: Detected lcore 11 as core 18 on socket 0 00:05:18.037 EAL: Detected lcore 12 as core 19 on socket 0 00:05:18.037 EAL: Detected lcore 13 as core 20 on socket 0 00:05:18.037 EAL: Detected lcore 14 as core 24 on socket 0 00:05:18.037 EAL: Detected lcore 15 as core 25 on socket 0 00:05:18.037 EAL: Detected lcore 16 as core 26 on socket 0 00:05:18.037 EAL: Detected lcore 17 as core 27 on socket 0 00:05:18.037 EAL: Detected lcore 18 as core 0 on socket 1 00:05:18.037 EAL: Detected lcore 19 as core 1 on socket 1 00:05:18.037 EAL: Detected lcore 20 as core 2 on socket 1 00:05:18.037 EAL: Detected lcore 21 as core 3 on socket 1 00:05:18.037 EAL: Detected lcore 22 as core 4 on socket 1 00:05:18.037 EAL: Detected lcore 23 as core 8 on socket 1 00:05:18.037 EAL: Detected lcore 24 as core 9 on socket 1 00:05:18.037 EAL: Detected lcore 25 as core 10 on socket 1 00:05:18.037 EAL: Detected lcore 26 as core 11 on socket 1 00:05:18.037 EAL: Detected lcore 27 as core 16 on socket 1 00:05:18.037 EAL: Detected lcore 28 as core 17 on socket 1 00:05:18.037 EAL: Detected lcore 29 as core 18 on socket 1 00:05:18.037 EAL: Detected lcore 30 as core 19 on socket 1 00:05:18.037 EAL: Detected lcore 31 as core 20 on socket 1 00:05:18.037 EAL: Detected lcore 32 as core 24 on socket 1 00:05:18.037 EAL: Detected lcore 33 as core 25 on socket 1 00:05:18.037 EAL: Detected lcore 34 as core 26 on socket 1 00:05:18.037 EAL: Detected lcore 35 as core 27 on socket 1 00:05:18.037 EAL: Detected lcore 36 as core 0 on socket 0 00:05:18.037 EAL: Detected lcore 37 as core 1 on socket 0 00:05:18.037 EAL: Detected lcore 38 as core 2 on socket 0 00:05:18.037 EAL: Detected lcore 39 as core 3 on socket 0 00:05:18.037 EAL: Detected lcore 40 as core 4 on socket 0 00:05:18.037 EAL: Detected lcore 41 as core 8 on socket 0 00:05:18.037 EAL: Detected lcore 42 as core 9 on socket 0 00:05:18.037 EAL: Detected lcore 43 as core 10 on socket 0 00:05:18.037 EAL: Detected lcore 44 as core 11 on socket 0 00:05:18.037 EAL: Detected lcore 45 as core 16 on socket 0 00:05:18.037 EAL: Detected lcore 46 as core 17 on socket 0 00:05:18.037 EAL: Detected lcore 47 as core 18 on socket 0 00:05:18.037 
EAL: Detected lcore 48 as core 19 on socket 0 00:05:18.037 EAL: Detected lcore 49 as core 20 on socket 0 00:05:18.037 EAL: Detected lcore 50 as core 24 on socket 0 00:05:18.037 EAL: Detected lcore 51 as core 25 on socket 0 00:05:18.037 EAL: Detected lcore 52 as core 26 on socket 0 00:05:18.037 EAL: Detected lcore 53 as core 27 on socket 0 00:05:18.037 EAL: Detected lcore 54 as core 0 on socket 1 00:05:18.037 EAL: Detected lcore 55 as core 1 on socket 1 00:05:18.037 EAL: Detected lcore 56 as core 2 on socket 1 00:05:18.037 EAL: Detected lcore 57 as core 3 on socket 1 00:05:18.037 EAL: Detected lcore 58 as core 4 on socket 1 00:05:18.037 EAL: Detected lcore 59 as core 8 on socket 1 00:05:18.037 EAL: Detected lcore 60 as core 9 on socket 1 00:05:18.037 EAL: Detected lcore 61 as core 10 on socket 1 00:05:18.037 EAL: Detected lcore 62 as core 11 on socket 1 00:05:18.037 EAL: Detected lcore 63 as core 16 on socket 1 00:05:18.037 EAL: Detected lcore 64 as core 17 on socket 1 00:05:18.037 EAL: Detected lcore 65 as core 18 on socket 1 00:05:18.037 EAL: Detected lcore 66 as core 19 on socket 1 00:05:18.037 EAL: Detected lcore 67 as core 20 on socket 1 00:05:18.037 EAL: Detected lcore 68 as core 24 on socket 1 00:05:18.037 EAL: Detected lcore 69 as core 25 on socket 1 00:05:18.037 EAL: Detected lcore 70 as core 26 on socket 1 00:05:18.037 EAL: Detected lcore 71 as core 27 on socket 1 00:05:18.037 EAL: Maximum logical cores by configuration: 128 00:05:18.037 EAL: Detected CPU lcores: 72 00:05:18.037 EAL: Detected NUMA nodes: 2 00:05:18.037 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:18.037 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:18.037 EAL: Checking presence of .so 'librte_eal.so' 00:05:18.037 EAL: Detected static linkage of DPDK 00:05:18.037 EAL: No shared files mode enabled, IPC will be disabled 00:05:18.037 EAL: Bus pci wants IOVA as 'DC' 00:05:18.037 EAL: Buses did not request a specific IOVA mode. 00:05:18.037 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:18.037 EAL: Selected IOVA mode 'VA' 00:05:18.037 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.037 EAL: Probing VFIO support... 00:05:18.037 EAL: IOMMU type 1 (Type 1) is supported 00:05:18.037 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:18.037 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:18.037 EAL: VFIO support initialized 00:05:18.037 EAL: Ask a virtual area of 0x2e000 bytes 00:05:18.037 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:18.037 EAL: Setting up physically contiguous memory... 
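Everything the EAL reports in this block (72 lcores over 2 sockets, IOMMU present, VFIO type 1 usable) can be sanity-checked from the host before a run. A sketch using generic sysfs and util-linux tools, nothing SPDK-specific:

    # Sketch: host-side check of the preconditions the EAL just probed.
    ls /sys/kernel/iommu_groups | wc -l           # >0 when an IOMMU is active, so VFIO type 1 is usable
    lscpu | grep -E '^CPU\(s\)|NUMA node\(s\)'    # should report 72 CPUs across 2 NUMA nodes here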
00:05:18.037 EAL: Setting maximum number of open files to 524288 00:05:18.037 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:18.037 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:18.037 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:18.037 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:18.037 EAL: Ask a virtual area of 0x61000 bytes 00:05:18.037 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:18.037 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:18.037 EAL: Ask a virtual area of 0x400000000 bytes 00:05:18.037 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:18.037 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:18.037 EAL: Hugepages will be freed exactly as allocated. 00:05:18.037 EAL: No shared files mode enabled, IPC is disabled 00:05:18.037 EAL: No shared files mode enabled, IPC is disabled 00:05:18.037 EAL: TSC frequency is ~2300000 KHz 00:05:18.037 EAL: Main lcore 0 is ready (tid=7fab8d7a6a00;cpuset=[0]) 00:05:18.037 EAL: Trying to obtain current memory policy. 00:05:18.037 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.037 EAL: Restoring previous memory policy: 0 00:05:18.037 EAL: request: mp_malloc_sync 00:05:18.037 EAL: No shared files mode enabled, IPC is disabled 00:05:18.037 EAL: Heap on socket 0 was expanded by 2MB 00:05:18.037 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Mem event callback 'spdk:(nil)' registered 00:05:18.296 00:05:18.296 00:05:18.296 CUnit - A unit testing framework for C - Version 2.1-3 00:05:18.296 http://cunit.sourceforge.net/ 00:05:18.296 00:05:18.296 00:05:18.296 Suite: components_suite 00:05:18.296 Test: vtophys_malloc_test ...passed 00:05:18.296 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:18.296 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.296 EAL: Restoring previous memory policy: 4 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was expanded by 4MB 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was shrunk by 4MB 00:05:18.296 EAL: Trying to obtain current memory policy. 00:05:18.296 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.296 EAL: Restoring previous memory policy: 4 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was expanded by 6MB 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was shrunk by 6MB 00:05:18.296 EAL: Trying to obtain current memory policy. 00:05:18.296 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.296 EAL: Restoring previous memory policy: 4 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was expanded by 10MB 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was shrunk by 10MB 00:05:18.296 EAL: Trying to obtain current memory policy. 
00:05:18.296 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.296 EAL: Restoring previous memory policy: 4 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was expanded by 18MB 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was shrunk by 18MB 00:05:18.296 EAL: Trying to obtain current memory policy. 00:05:18.296 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.296 EAL: Restoring previous memory policy: 4 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.296 EAL: request: mp_malloc_sync 00:05:18.296 EAL: No shared files mode enabled, IPC is disabled 00:05:18.296 EAL: Heap on socket 0 was expanded by 34MB 00:05:18.296 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was shrunk by 34MB 00:05:18.297 EAL: Trying to obtain current memory policy. 00:05:18.297 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.297 EAL: Restoring previous memory policy: 4 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was expanded by 66MB 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was shrunk by 66MB 00:05:18.297 EAL: Trying to obtain current memory policy. 00:05:18.297 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.297 EAL: Restoring previous memory policy: 4 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was expanded by 130MB 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was shrunk by 130MB 00:05:18.297 EAL: Trying to obtain current memory policy. 00:05:18.297 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.297 EAL: Restoring previous memory policy: 4 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.297 EAL: request: mp_malloc_sync 00:05:18.297 EAL: No shared files mode enabled, IPC is disabled 00:05:18.297 EAL: Heap on socket 0 was expanded by 258MB 00:05:18.297 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.555 EAL: request: mp_malloc_sync 00:05:18.555 EAL: No shared files mode enabled, IPC is disabled 00:05:18.555 EAL: Heap on socket 0 was shrunk by 258MB 00:05:18.555 EAL: Trying to obtain current memory policy. 
00:05:18.555 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:18.556 EAL: Restoring previous memory policy: 4 00:05:18.556 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.556 EAL: request: mp_malloc_sync 00:05:18.556 EAL: No shared files mode enabled, IPC is disabled 00:05:18.556 EAL: Heap on socket 0 was expanded by 514MB 00:05:18.556 EAL: Calling mem event callback 'spdk:(nil)' 00:05:18.814 EAL: request: mp_malloc_sync 00:05:18.814 EAL: No shared files mode enabled, IPC is disabled 00:05:18.814 EAL: Heap on socket 0 was shrunk by 514MB 00:05:18.814 EAL: Trying to obtain current memory policy. 00:05:18.814 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.073 EAL: Restoring previous memory policy: 4 00:05:19.073 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.073 EAL: request: mp_malloc_sync 00:05:19.073 EAL: No shared files mode enabled, IPC is disabled 00:05:19.073 EAL: Heap on socket 0 was expanded by 1026MB 00:05:19.073 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.331 EAL: request: mp_malloc_sync 00:05:19.331 EAL: No shared files mode enabled, IPC is disabled 00:05:19.331 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:19.331 passed 00:05:19.331 00:05:19.331 Run Summary: Type Total Ran Passed Failed Inactive 00:05:19.331 suites 1 1 n/a 0 0 00:05:19.331 tests 2 2 2 0 0 00:05:19.331 asserts 497 497 497 0 n/a 00:05:19.331 00:05:19.331 Elapsed time = 1.060 seconds 00:05:19.331 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.331 EAL: request: mp_malloc_sync 00:05:19.331 EAL: No shared files mode enabled, IPC is disabled 00:05:19.331 EAL: Heap on socket 0 was shrunk by 2MB 00:05:19.331 EAL: No shared files mode enabled, IPC is disabled 00:05:19.331 EAL: No shared files mode enabled, IPC is disabled 00:05:19.331 EAL: No shared files mode enabled, IPC is disabled 00:05:19.331 00:05:19.331 real 0m1.187s 00:05:19.331 user 0m0.679s 00:05:19.331 sys 0m0.476s 00:05:19.331 19:12:06 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:19.331 19:12:06 -- common/autotest_common.sh@10 -- # set +x 00:05:19.331 ************************************ 00:05:19.331 END TEST env_vtophys 00:05:19.331 ************************************ 00:05:19.331 19:12:06 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:19.331 19:12:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.331 19:12:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.331 19:12:06 -- common/autotest_common.sh@10 -- # set +x 00:05:19.589 ************************************ 00:05:19.589 START TEST env_pci 00:05:19.589 ************************************ 00:05:19.589 19:12:06 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:19.589 00:05:19.589 00:05:19.589 CUnit - A unit testing framework for C - Version 2.1-3 00:05:19.589 http://cunit.sourceforge.net/ 00:05:19.589 00:05:19.589 00:05:19.589 Suite: pci 00:05:19.589 Test: pci_hook ...[2024-04-24 19:12:06.371156] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1600927 has claimed it 00:05:19.589 EAL: Cannot find device (10000:00:01.0) 00:05:19.589 EAL: Failed to attach device on primary process 00:05:19.589 passed 00:05:19.589 00:05:19.589 Run Summary: Type Total Ran Passed Failed Inactive 00:05:19.589 suites 1 1 n/a 0 0 00:05:19.589 tests 1 1 1 0 0 
00:05:19.589 asserts 25 25 25 0 n/a 00:05:19.589 00:05:19.589 Elapsed time = 0.036 seconds 00:05:19.589 00:05:19.589 real 0m0.054s 00:05:19.589 user 0m0.012s 00:05:19.589 sys 0m0.042s 00:05:19.589 19:12:06 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:19.589 19:12:06 -- common/autotest_common.sh@10 -- # set +x 00:05:19.589 ************************************ 00:05:19.589 END TEST env_pci 00:05:19.589 ************************************ 00:05:19.589 19:12:06 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:19.589 19:12:06 -- env/env.sh@15 -- # uname 00:05:19.589 19:12:06 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:19.589 19:12:06 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:19.590 19:12:06 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:19.590 19:12:06 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:19.590 19:12:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.590 19:12:06 -- common/autotest_common.sh@10 -- # set +x 00:05:19.590 ************************************ 00:05:19.590 START TEST env_dpdk_post_init 00:05:19.590 ************************************ 00:05:19.590 19:12:06 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:19.848 EAL: Detected CPU lcores: 72 00:05:19.848 EAL: Detected NUMA nodes: 2 00:05:19.848 EAL: Detected static linkage of DPDK 00:05:19.848 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:19.848 EAL: Selected IOVA mode 'VA' 00:05:19.848 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.848 EAL: VFIO support initialized 00:05:19.848 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:19.848 EAL: Using IOMMU type 1 (Type 1) 00:05:20.785 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0) 00:05:26.051 EAL: Releasing PCI mapped resource for 0000:1a:00.0 00:05:26.051 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000 00:05:26.051 Starting DPDK initialization... 00:05:26.051 Starting SPDK post initialization... 00:05:26.051 SPDK NVMe probe 00:05:26.051 Attaching to 0000:1a:00.0 00:05:26.051 Attached to 0000:1a:00.0 00:05:26.051 Cleaning up... 
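env_dpdk_post_init above is launched with the same two EAL options used throughout this run: -c 0x1 pins it to a single core, and --base-virtaddr fixes where the EAL maps hugepage memory so mappings stay consistent across processes. Running it standalone looks like this (a sketch; it needs hugepages reserved and the NVMe device bound to vfio-pci, as done earlier in the log):

    # Sketch: run the post-init test by hand with the flags from the log.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    sudo "$SPDK_DIR/test/env/env_dpdk_post_init/env_dpdk_post_init" \
        -c 0x1 --base-virtaddr=0x200000000000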
00:05:26.051 00:05:26.051 real 0m6.459s 00:05:26.051 user 0m4.922s 00:05:26.051 sys 0m0.791s 00:05:26.051 19:12:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:26.051 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.051 ************************************ 00:05:26.051 END TEST env_dpdk_post_init 00:05:26.051 ************************************ 00:05:26.309 19:12:13 -- env/env.sh@26 -- # uname 00:05:26.309 19:12:13 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:26.309 19:12:13 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:26.309 19:12:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.309 19:12:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.309 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.309 ************************************ 00:05:26.309 START TEST env_mem_callbacks 00:05:26.309 ************************************ 00:05:26.309 19:12:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:26.309 EAL: Detected CPU lcores: 72 00:05:26.309 EAL: Detected NUMA nodes: 2 00:05:26.309 EAL: Detected static linkage of DPDK 00:05:26.309 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:26.309 EAL: Selected IOVA mode 'VA' 00:05:26.309 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.309 EAL: VFIO support initialized 00:05:26.310 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:26.310 00:05:26.310 00:05:26.310 CUnit - A unit testing framework for C - Version 2.1-3 00:05:26.310 http://cunit.sourceforge.net/ 00:05:26.310 00:05:26.310 00:05:26.310 Suite: memory 00:05:26.310 Test: test ... 
00:05:26.310 register 0x200000200000 2097152 00:05:26.310 malloc 3145728 00:05:26.310 register 0x200000400000 4194304 00:05:26.310 buf 0x200000500000 len 3145728 PASSED 00:05:26.310 malloc 64 00:05:26.310 buf 0x2000004fff40 len 64 PASSED 00:05:26.310 malloc 4194304 00:05:26.310 register 0x200000800000 6291456 00:05:26.310 buf 0x200000a00000 len 4194304 PASSED 00:05:26.310 free 0x200000500000 3145728 00:05:26.310 free 0x2000004fff40 64 00:05:26.310 unregister 0x200000400000 4194304 PASSED 00:05:26.310 free 0x200000a00000 4194304 00:05:26.310 unregister 0x200000800000 6291456 PASSED 00:05:26.310 malloc 8388608 00:05:26.310 register 0x200000400000 10485760 00:05:26.310 buf 0x200000600000 len 8388608 PASSED 00:05:26.310 free 0x200000600000 8388608 00:05:26.310 unregister 0x200000400000 10485760 PASSED 00:05:26.310 passed 00:05:26.310 00:05:26.310 Run Summary: Type Total Ran Passed Failed Inactive 00:05:26.310 suites 1 1 n/a 0 0 00:05:26.310 tests 1 1 1 0 0 00:05:26.310 asserts 15 15 15 0 n/a 00:05:26.310 00:05:26.310 Elapsed time = 0.005 seconds 00:05:26.310 00:05:26.310 real 0m0.068s 00:05:26.310 user 0m0.019s 00:05:26.310 sys 0m0.049s 00:05:26.310 19:12:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:26.310 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.310 ************************************ 00:05:26.310 END TEST env_mem_callbacks 00:05:26.310 ************************************ 00:05:26.567 00:05:26.567 real 0m8.891s 00:05:26.567 user 0m6.086s 00:05:26.567 sys 0m1.953s 00:05:26.567 19:12:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:26.567 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.567 ************************************ 00:05:26.567 END TEST env 00:05:26.567 ************************************ 00:05:26.567 19:12:13 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:26.567 19:12:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.567 19:12:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.567 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.567 ************************************ 00:05:26.567 START TEST rpc 00:05:26.567 ************************************ 00:05:26.567 19:12:13 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:26.826 * Looking for test storage... 00:05:26.826 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:26.826 19:12:13 -- rpc/rpc.sh@65 -- # spdk_pid=1602090 00:05:26.826 19:12:13 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:26.826 19:12:13 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:26.826 19:12:13 -- rpc/rpc.sh@67 -- # waitforlisten 1602090 00:05:26.826 19:12:13 -- common/autotest_common.sh@817 -- # '[' -z 1602090 ']' 00:05:26.826 19:12:13 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:26.826 19:12:13 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:26.826 19:12:13 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:26.826 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
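Here rpc.sh starts spdk_tgt with -e bdev (enabling the bdev tracepoint group, the 0x8 mask visible in the trace_get_info output further down) and then blocks until the RPC socket answers. The real waitforlisten in autotest_common.sh is more careful about retries and process death; the core idea, as a sketch:

    # Sketch of the wait-for-RPC pattern, not the actual waitforlisten:
    # start the target, then poll the UNIX socket with a trivial RPC.
    SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/short-fuzz-phy-autotest/spdk}
    "$SPDK_DIR/build/bin/spdk_tgt" -e bdev &
    spdk_pid=$!

    until "$SPDK_DIR/scripts/rpc.py" -t 1 rpc_get_methods >/dev/null 2>&1; do
        kill -0 "$spdk_pid" 2>/dev/null || { echo "target died"; exit 1; }
        sleep 0.5
    done
    echo "spdk_tgt ($spdk_pid) listening on /var/tmp/spdk.sock"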
00:05:26.826 19:12:13 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:26.826 19:12:13 -- common/autotest_common.sh@10 -- # set +x 00:05:26.826 [2024-04-24 19:12:13.685703] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:26.826 [2024-04-24 19:12:13.685796] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602090 ] 00:05:26.826 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.826 [2024-04-24 19:12:13.762513] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:27.084 [2024-04-24 19:12:13.855361] app.c: 523:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:27.084 [2024-04-24 19:12:13.855400] app.c: 527:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1602090' to capture a snapshot of events at runtime. 00:05:27.084 [2024-04-24 19:12:13.855410] app.c: 529:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:27.084 [2024-04-24 19:12:13.855419] app.c: 530:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:27.084 [2024-04-24 19:12:13.855427] app.c: 531:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1602090 for offline analysis/debug. 00:05:27.084 [2024-04-24 19:12:13.855450] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.650 19:12:14 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:27.650 19:12:14 -- common/autotest_common.sh@850 -- # return 0 00:05:27.650 19:12:14 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:27.650 19:12:14 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:27.650 19:12:14 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:27.650 19:12:14 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:27.650 19:12:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.650 19:12:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.650 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.650 ************************************ 00:05:27.650 START TEST rpc_integrity 00:05:27.650 ************************************ 00:05:27.650 19:12:14 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:27.650 19:12:14 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:27.650 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.650 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.650 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.650 19:12:14 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:27.650 19:12:14 -- rpc/rpc.sh@13 -- # jq length 00:05:27.908 19:12:14 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:27.908 19:12:14 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:27.908 19:12:14 -- common/autotest_common.sh@549 -- # 
xtrace_disable 00:05:27.908 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.908 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.908 19:12:14 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:27.908 19:12:14 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:27.908 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.908 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.908 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.908 19:12:14 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:27.908 { 00:05:27.908 "name": "Malloc0", 00:05:27.909 "aliases": [ 00:05:27.909 "889e5970-b6de-48cd-897c-f518c8f22d75" 00:05:27.909 ], 00:05:27.909 "product_name": "Malloc disk", 00:05:27.909 "block_size": 512, 00:05:27.909 "num_blocks": 16384, 00:05:27.909 "uuid": "889e5970-b6de-48cd-897c-f518c8f22d75", 00:05:27.909 "assigned_rate_limits": { 00:05:27.909 "rw_ios_per_sec": 0, 00:05:27.909 "rw_mbytes_per_sec": 0, 00:05:27.909 "r_mbytes_per_sec": 0, 00:05:27.909 "w_mbytes_per_sec": 0 00:05:27.909 }, 00:05:27.909 "claimed": false, 00:05:27.909 "zoned": false, 00:05:27.909 "supported_io_types": { 00:05:27.909 "read": true, 00:05:27.909 "write": true, 00:05:27.909 "unmap": true, 00:05:27.909 "write_zeroes": true, 00:05:27.909 "flush": true, 00:05:27.909 "reset": true, 00:05:27.909 "compare": false, 00:05:27.909 "compare_and_write": false, 00:05:27.909 "abort": true, 00:05:27.909 "nvme_admin": false, 00:05:27.909 "nvme_io": false 00:05:27.909 }, 00:05:27.909 "memory_domains": [ 00:05:27.909 { 00:05:27.909 "dma_device_id": "system", 00:05:27.909 "dma_device_type": 1 00:05:27.909 }, 00:05:27.909 { 00:05:27.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.909 "dma_device_type": 2 00:05:27.909 } 00:05:27.909 ], 00:05:27.909 "driver_specific": {} 00:05:27.909 } 00:05:27.909 ]' 00:05:27.909 19:12:14 -- rpc/rpc.sh@17 -- # jq length 00:05:27.909 19:12:14 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:27.909 19:12:14 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:27.909 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 [2024-04-24 19:12:14.776644] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:27.909 [2024-04-24 19:12:14.776682] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:27.909 [2024-04-24 19:12:14.776707] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5753dd0 00:05:27.909 [2024-04-24 19:12:14.776718] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:27.909 [2024-04-24 19:12:14.777623] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:27.909 [2024-04-24 19:12:14.777647] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:27.909 Passthru0 00:05:27.909 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.909 19:12:14 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:27.909 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.909 19:12:14 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:27.909 { 00:05:27.909 "name": "Malloc0", 00:05:27.909 "aliases": [ 00:05:27.909 "889e5970-b6de-48cd-897c-f518c8f22d75" 00:05:27.909 ], 00:05:27.909 "product_name": "Malloc disk", 00:05:27.909 
"block_size": 512, 00:05:27.909 "num_blocks": 16384, 00:05:27.909 "uuid": "889e5970-b6de-48cd-897c-f518c8f22d75", 00:05:27.909 "assigned_rate_limits": { 00:05:27.909 "rw_ios_per_sec": 0, 00:05:27.909 "rw_mbytes_per_sec": 0, 00:05:27.909 "r_mbytes_per_sec": 0, 00:05:27.909 "w_mbytes_per_sec": 0 00:05:27.909 }, 00:05:27.909 "claimed": true, 00:05:27.909 "claim_type": "exclusive_write", 00:05:27.909 "zoned": false, 00:05:27.909 "supported_io_types": { 00:05:27.909 "read": true, 00:05:27.909 "write": true, 00:05:27.909 "unmap": true, 00:05:27.909 "write_zeroes": true, 00:05:27.909 "flush": true, 00:05:27.909 "reset": true, 00:05:27.909 "compare": false, 00:05:27.909 "compare_and_write": false, 00:05:27.909 "abort": true, 00:05:27.909 "nvme_admin": false, 00:05:27.909 "nvme_io": false 00:05:27.909 }, 00:05:27.909 "memory_domains": [ 00:05:27.909 { 00:05:27.909 "dma_device_id": "system", 00:05:27.909 "dma_device_type": 1 00:05:27.909 }, 00:05:27.909 { 00:05:27.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.909 "dma_device_type": 2 00:05:27.909 } 00:05:27.909 ], 00:05:27.909 "driver_specific": {} 00:05:27.909 }, 00:05:27.909 { 00:05:27.909 "name": "Passthru0", 00:05:27.909 "aliases": [ 00:05:27.909 "3be0e1fc-82ba-5d46-9d11-adb931b53111" 00:05:27.909 ], 00:05:27.909 "product_name": "passthru", 00:05:27.909 "block_size": 512, 00:05:27.909 "num_blocks": 16384, 00:05:27.909 "uuid": "3be0e1fc-82ba-5d46-9d11-adb931b53111", 00:05:27.909 "assigned_rate_limits": { 00:05:27.909 "rw_ios_per_sec": 0, 00:05:27.909 "rw_mbytes_per_sec": 0, 00:05:27.909 "r_mbytes_per_sec": 0, 00:05:27.909 "w_mbytes_per_sec": 0 00:05:27.909 }, 00:05:27.909 "claimed": false, 00:05:27.909 "zoned": false, 00:05:27.909 "supported_io_types": { 00:05:27.909 "read": true, 00:05:27.909 "write": true, 00:05:27.909 "unmap": true, 00:05:27.909 "write_zeroes": true, 00:05:27.909 "flush": true, 00:05:27.909 "reset": true, 00:05:27.909 "compare": false, 00:05:27.909 "compare_and_write": false, 00:05:27.909 "abort": true, 00:05:27.909 "nvme_admin": false, 00:05:27.909 "nvme_io": false 00:05:27.909 }, 00:05:27.909 "memory_domains": [ 00:05:27.909 { 00:05:27.909 "dma_device_id": "system", 00:05:27.909 "dma_device_type": 1 00:05:27.909 }, 00:05:27.909 { 00:05:27.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.909 "dma_device_type": 2 00:05:27.909 } 00:05:27.909 ], 00:05:27.909 "driver_specific": { 00:05:27.909 "passthru": { 00:05:27.909 "name": "Passthru0", 00:05:27.909 "base_bdev_name": "Malloc0" 00:05:27.909 } 00:05:27.909 } 00:05:27.909 } 00:05:27.909 ]' 00:05:27.909 19:12:14 -- rpc/rpc.sh@21 -- # jq length 00:05:27.909 19:12:14 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:27.909 19:12:14 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:27.909 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.909 19:12:14 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:27.909 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:27.909 19:12:14 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:27.909 19:12:14 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 19:12:14 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 
00:05:27.909 19:12:14 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:27.909 19:12:14 -- rpc/rpc.sh@26 -- # jq length 00:05:27.909 19:12:14 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:27.909 00:05:27.909 real 0m0.270s 00:05:27.909 user 0m0.172s 00:05:27.909 sys 0m0.046s 00:05:27.909 19:12:14 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:27.909 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:27.909 ************************************ 00:05:27.909 END TEST rpc_integrity 00:05:27.909 ************************************ 00:05:28.168 19:12:14 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:28.168 19:12:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:28.168 19:12:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:28.168 19:12:14 -- common/autotest_common.sh@10 -- # set +x 00:05:28.168 ************************************ 00:05:28.168 START TEST rpc_plugins 00:05:28.168 ************************************ 00:05:28.168 19:12:15 -- common/autotest_common.sh@1111 -- # rpc_plugins 00:05:28.168 19:12:15 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:28.168 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.168 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.168 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.168 19:12:15 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:28.168 19:12:15 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:28.168 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.168 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.168 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.168 19:12:15 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:28.168 { 00:05:28.168 "name": "Malloc1", 00:05:28.168 "aliases": [ 00:05:28.168 "258e54f7-8890-4fc4-a089-159349948f2a" 00:05:28.168 ], 00:05:28.168 "product_name": "Malloc disk", 00:05:28.168 "block_size": 4096, 00:05:28.168 "num_blocks": 256, 00:05:28.168 "uuid": "258e54f7-8890-4fc4-a089-159349948f2a", 00:05:28.168 "assigned_rate_limits": { 00:05:28.168 "rw_ios_per_sec": 0, 00:05:28.168 "rw_mbytes_per_sec": 0, 00:05:28.168 "r_mbytes_per_sec": 0, 00:05:28.168 "w_mbytes_per_sec": 0 00:05:28.168 }, 00:05:28.168 "claimed": false, 00:05:28.168 "zoned": false, 00:05:28.168 "supported_io_types": { 00:05:28.168 "read": true, 00:05:28.168 "write": true, 00:05:28.168 "unmap": true, 00:05:28.168 "write_zeroes": true, 00:05:28.168 "flush": true, 00:05:28.168 "reset": true, 00:05:28.168 "compare": false, 00:05:28.168 "compare_and_write": false, 00:05:28.168 "abort": true, 00:05:28.168 "nvme_admin": false, 00:05:28.168 "nvme_io": false 00:05:28.168 }, 00:05:28.168 "memory_domains": [ 00:05:28.168 { 00:05:28.168 "dma_device_id": "system", 00:05:28.168 "dma_device_type": 1 00:05:28.168 }, 00:05:28.168 { 00:05:28.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.168 "dma_device_type": 2 00:05:28.168 } 00:05:28.168 ], 00:05:28.168 "driver_specific": {} 00:05:28.168 } 00:05:28.168 ]' 00:05:28.168 19:12:15 -- rpc/rpc.sh@32 -- # jq length 00:05:28.427 19:12:15 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:28.427 19:12:15 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:28.427 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.427 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.427 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.427 19:12:15 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:28.427 19:12:15 -- 
common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.427 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.427 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.427 19:12:15 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:28.427 19:12:15 -- rpc/rpc.sh@36 -- # jq length 00:05:28.427 19:12:15 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:28.427 00:05:28.427 real 0m0.118s 00:05:28.427 user 0m0.072s 00:05:28.427 sys 0m0.023s 00:05:28.427 19:12:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:28.427 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.427 ************************************ 00:05:28.427 END TEST rpc_plugins 00:05:28.427 ************************************ 00:05:28.427 19:12:15 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:28.427 19:12:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:28.427 19:12:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:28.427 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.427 ************************************ 00:05:28.427 START TEST rpc_trace_cmd_test 00:05:28.427 ************************************ 00:05:28.427 19:12:15 -- common/autotest_common.sh@1111 -- # rpc_trace_cmd_test 00:05:28.427 19:12:15 -- rpc/rpc.sh@40 -- # local info 00:05:28.427 19:12:15 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:28.427 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.427 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.686 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.686 19:12:15 -- rpc/rpc.sh@42 -- # info='{ 00:05:28.686 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1602090", 00:05:28.686 "tpoint_group_mask": "0x8", 00:05:28.686 "iscsi_conn": { 00:05:28.686 "mask": "0x2", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "scsi": { 00:05:28.686 "mask": "0x4", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "bdev": { 00:05:28.686 "mask": "0x8", 00:05:28.686 "tpoint_mask": "0xffffffffffffffff" 00:05:28.686 }, 00:05:28.686 "nvmf_rdma": { 00:05:28.686 "mask": "0x10", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "nvmf_tcp": { 00:05:28.686 "mask": "0x20", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "ftl": { 00:05:28.686 "mask": "0x40", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "blobfs": { 00:05:28.686 "mask": "0x80", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "dsa": { 00:05:28.686 "mask": "0x200", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "thread": { 00:05:28.686 "mask": "0x400", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "nvme_pcie": { 00:05:28.686 "mask": "0x800", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "iaa": { 00:05:28.686 "mask": "0x1000", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "nvme_tcp": { 00:05:28.686 "mask": "0x2000", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "bdev_nvme": { 00:05:28.686 "mask": "0x4000", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 }, 00:05:28.686 "sock": { 00:05:28.686 "mask": "0x8000", 00:05:28.686 "tpoint_mask": "0x0" 00:05:28.686 } 00:05:28.686 }' 00:05:28.686 19:12:15 -- rpc/rpc.sh@43 -- # jq length 00:05:28.686 19:12:15 -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:28.686 19:12:15 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:28.686 19:12:15 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:28.686 19:12:15 -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:05:28.686 19:12:15 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:28.686 19:12:15 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:28.686 19:12:15 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:28.686 19:12:15 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:28.686 19:12:15 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:28.686 00:05:28.686 real 0m0.201s 00:05:28.686 user 0m0.166s 00:05:28.686 sys 0m0.025s 00:05:28.686 19:12:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:28.686 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.686 ************************************ 00:05:28.686 END TEST rpc_trace_cmd_test 00:05:28.686 ************************************ 00:05:28.686 19:12:15 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:28.686 19:12:15 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:28.686 19:12:15 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:28.686 19:12:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:28.686 19:12:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:28.686 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.944 ************************************ 00:05:28.944 START TEST rpc_daemon_integrity 00:05:28.944 ************************************ 00:05:28.944 19:12:15 -- common/autotest_common.sh@1111 -- # rpc_integrity 00:05:28.945 19:12:15 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:28.945 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.945 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.945 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.945 19:12:15 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:28.945 19:12:15 -- rpc/rpc.sh@13 -- # jq length 00:05:28.945 19:12:15 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:28.945 19:12:15 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:28.945 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.945 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.945 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.945 19:12:15 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:28.945 19:12:15 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:28.945 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:28.945 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:28.945 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:28.945 19:12:15 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:28.945 { 00:05:28.945 "name": "Malloc2", 00:05:28.945 "aliases": [ 00:05:28.945 "5575d461-1baa-4a9e-8c7b-5392821dbdec" 00:05:28.945 ], 00:05:28.945 "product_name": "Malloc disk", 00:05:28.945 "block_size": 512, 00:05:28.945 "num_blocks": 16384, 00:05:28.945 "uuid": "5575d461-1baa-4a9e-8c7b-5392821dbdec", 00:05:28.945 "assigned_rate_limits": { 00:05:28.945 "rw_ios_per_sec": 0, 00:05:28.945 "rw_mbytes_per_sec": 0, 00:05:28.945 "r_mbytes_per_sec": 0, 00:05:28.945 "w_mbytes_per_sec": 0 00:05:28.945 }, 00:05:28.945 "claimed": false, 00:05:28.945 "zoned": false, 00:05:28.945 "supported_io_types": { 00:05:28.945 "read": true, 00:05:28.945 "write": true, 00:05:28.945 "unmap": true, 00:05:28.945 "write_zeroes": true, 00:05:28.945 "flush": true, 00:05:28.945 "reset": true, 00:05:28.945 "compare": false, 00:05:28.945 "compare_and_write": false, 00:05:28.945 "abort": true, 00:05:28.945 "nvme_admin": false, 00:05:28.945 "nvme_io": false 00:05:28.945 }, 00:05:28.945 "memory_domains": [ 00:05:28.945 { 00:05:28.945 "dma_device_id": 
"system", 00:05:28.945 "dma_device_type": 1 00:05:28.945 }, 00:05:28.945 { 00:05:28.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:28.945 "dma_device_type": 2 00:05:28.945 } 00:05:28.945 ], 00:05:28.945 "driver_specific": {} 00:05:28.945 } 00:05:28.945 ]' 00:05:28.945 19:12:15 -- rpc/rpc.sh@17 -- # jq length 00:05:29.204 19:12:15 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:29.204 19:12:15 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:29.204 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:29.204 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 [2024-04-24 19:12:15.967722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:29.204 [2024-04-24 19:12:15.967760] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:29.204 [2024-04-24 19:12:15.967777] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x58efd50 00:05:29.204 [2024-04-24 19:12:15.967787] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:29.204 [2024-04-24 19:12:15.968584] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:29.204 [2024-04-24 19:12:15.968607] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:29.204 Passthru0 00:05:29.204 19:12:15 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:29.204 19:12:15 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:29.204 19:12:15 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:29.204 19:12:15 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 19:12:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:29.204 19:12:16 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:29.204 { 00:05:29.204 "name": "Malloc2", 00:05:29.204 "aliases": [ 00:05:29.204 "5575d461-1baa-4a9e-8c7b-5392821dbdec" 00:05:29.204 ], 00:05:29.204 "product_name": "Malloc disk", 00:05:29.204 "block_size": 512, 00:05:29.204 "num_blocks": 16384, 00:05:29.204 "uuid": "5575d461-1baa-4a9e-8c7b-5392821dbdec", 00:05:29.204 "assigned_rate_limits": { 00:05:29.204 "rw_ios_per_sec": 0, 00:05:29.204 "rw_mbytes_per_sec": 0, 00:05:29.204 "r_mbytes_per_sec": 0, 00:05:29.204 "w_mbytes_per_sec": 0 00:05:29.204 }, 00:05:29.204 "claimed": true, 00:05:29.204 "claim_type": "exclusive_write", 00:05:29.204 "zoned": false, 00:05:29.204 "supported_io_types": { 00:05:29.204 "read": true, 00:05:29.204 "write": true, 00:05:29.204 "unmap": true, 00:05:29.204 "write_zeroes": true, 00:05:29.204 "flush": true, 00:05:29.204 "reset": true, 00:05:29.204 "compare": false, 00:05:29.204 "compare_and_write": false, 00:05:29.204 "abort": true, 00:05:29.204 "nvme_admin": false, 00:05:29.204 "nvme_io": false 00:05:29.204 }, 00:05:29.204 "memory_domains": [ 00:05:29.204 { 00:05:29.204 "dma_device_id": "system", 00:05:29.204 "dma_device_type": 1 00:05:29.204 }, 00:05:29.204 { 00:05:29.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.204 "dma_device_type": 2 00:05:29.204 } 00:05:29.204 ], 00:05:29.204 "driver_specific": {} 00:05:29.204 }, 00:05:29.204 { 00:05:29.204 "name": "Passthru0", 00:05:29.204 "aliases": [ 00:05:29.204 "d73ef8fd-aae3-53f8-a8f3-10ce69abb71c" 00:05:29.204 ], 00:05:29.204 "product_name": "passthru", 00:05:29.204 "block_size": 512, 00:05:29.204 "num_blocks": 16384, 00:05:29.204 "uuid": "d73ef8fd-aae3-53f8-a8f3-10ce69abb71c", 00:05:29.204 "assigned_rate_limits": { 00:05:29.204 "rw_ios_per_sec": 0, 00:05:29.204 "rw_mbytes_per_sec": 0, 00:05:29.204 "r_mbytes_per_sec": 0, 
00:05:29.204 "w_mbytes_per_sec": 0 00:05:29.204 }, 00:05:29.204 "claimed": false, 00:05:29.204 "zoned": false, 00:05:29.204 "supported_io_types": { 00:05:29.204 "read": true, 00:05:29.204 "write": true, 00:05:29.204 "unmap": true, 00:05:29.204 "write_zeroes": true, 00:05:29.204 "flush": true, 00:05:29.204 "reset": true, 00:05:29.204 "compare": false, 00:05:29.204 "compare_and_write": false, 00:05:29.204 "abort": true, 00:05:29.204 "nvme_admin": false, 00:05:29.204 "nvme_io": false 00:05:29.204 }, 00:05:29.204 "memory_domains": [ 00:05:29.204 { 00:05:29.204 "dma_device_id": "system", 00:05:29.204 "dma_device_type": 1 00:05:29.204 }, 00:05:29.204 { 00:05:29.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:29.204 "dma_device_type": 2 00:05:29.204 } 00:05:29.204 ], 00:05:29.204 "driver_specific": { 00:05:29.204 "passthru": { 00:05:29.204 "name": "Passthru0", 00:05:29.204 "base_bdev_name": "Malloc2" 00:05:29.204 } 00:05:29.204 } 00:05:29.204 } 00:05:29.204 ]' 00:05:29.204 19:12:16 -- rpc/rpc.sh@21 -- # jq length 00:05:29.204 19:12:16 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:29.204 19:12:16 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:29.204 19:12:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:29.204 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 19:12:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:29.204 19:12:16 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:29.204 19:12:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:29.204 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 19:12:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:29.204 19:12:16 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:29.204 19:12:16 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:29.204 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 19:12:16 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:29.204 19:12:16 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:29.204 19:12:16 -- rpc/rpc.sh@26 -- # jq length 00:05:29.204 19:12:16 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:29.204 00:05:29.204 real 0m0.277s 00:05:29.204 user 0m0.169s 00:05:29.204 sys 0m0.051s 00:05:29.204 19:12:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:29.204 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.204 ************************************ 00:05:29.204 END TEST rpc_daemon_integrity 00:05:29.204 ************************************ 00:05:29.204 19:12:16 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:29.204 19:12:16 -- rpc/rpc.sh@84 -- # killprocess 1602090 00:05:29.204 19:12:16 -- common/autotest_common.sh@936 -- # '[' -z 1602090 ']' 00:05:29.204 19:12:16 -- common/autotest_common.sh@940 -- # kill -0 1602090 00:05:29.204 19:12:16 -- common/autotest_common.sh@941 -- # uname 00:05:29.204 19:12:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:29.204 19:12:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1602090 00:05:29.204 19:12:16 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:29.204 19:12:16 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:29.204 19:12:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1602090' 00:05:29.204 killing process with pid 1602090 00:05:29.204 19:12:16 -- common/autotest_common.sh@955 -- # kill 1602090 00:05:29.204 19:12:16 -- common/autotest_common.sh@960 -- # wait 1602090 00:05:29.771 00:05:29.771 real 0m2.967s 00:05:29.771 user 
0m3.738s 00:05:29.771 sys 0m1.026s 00:05:29.771 19:12:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:29.771 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.771 ************************************ 00:05:29.771 END TEST rpc 00:05:29.771 ************************************ 00:05:29.771 19:12:16 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:29.771 19:12:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:29.771 19:12:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:29.771 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:29.771 ************************************ 00:05:29.771 START TEST skip_rpc 00:05:29.772 ************************************ 00:05:29.772 19:12:16 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:30.030 * Looking for test storage... 00:05:30.030 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:30.030 19:12:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:30.030 19:12:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:30.030 19:12:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.030 ************************************ 00:05:30.030 START TEST skip_rpc 00:05:30.030 ************************************ 00:05:30.030 19:12:16 -- common/autotest_common.sh@1111 -- # test_skip_rpc 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1602665 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:30.030 19:12:16 -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:30.030 [2024-04-24 19:12:16.992337] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
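The skip_rpc case underway here reduces to a short flow; a condensed sketch, assuming the rpc_cmd, NOT, and killprocess helpers sourced from autotest_common.sh (the full spdk_tgt path is shortened):

    # Launch the target with its RPC server disabled.
    spdk_tgt --no-rpc-server -m 0x1 &
    spdk_pid=$!
    sleep 5
    # NOT inverts the exit status: the step passes only if the RPC call
    # cannot reach /var/tmp/spdk.sock and fails as expected.
    NOT rpc_cmd spdk_get_version
    killprocess $spdk_pid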
00:05:30.030 [2024-04-24 19:12:16.992402] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602665 ] 00:05:30.030 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.289 [2024-04-24 19:12:17.067165] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:30.289 [2024-04-24 19:12:17.152958] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.588 19:12:21 -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:35.589 19:12:21 -- common/autotest_common.sh@638 -- # local es=0 00:05:35.589 19:12:21 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:35.589 19:12:21 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:05:35.589 19:12:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:35.589 19:12:21 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:05:35.589 19:12:21 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:35.589 19:12:21 -- common/autotest_common.sh@641 -- # rpc_cmd spdk_get_version 00:05:35.589 19:12:21 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:35.589 19:12:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.589 19:12:21 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:35.589 19:12:21 -- common/autotest_common.sh@641 -- # es=1 00:05:35.589 19:12:21 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:35.589 19:12:21 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:35.589 19:12:21 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:35.589 19:12:21 -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:35.589 19:12:21 -- rpc/skip_rpc.sh@23 -- # killprocess 1602665 00:05:35.589 19:12:21 -- common/autotest_common.sh@936 -- # '[' -z 1602665 ']' 00:05:35.589 19:12:21 -- common/autotest_common.sh@940 -- # kill -0 1602665 00:05:35.589 19:12:21 -- common/autotest_common.sh@941 -- # uname 00:05:35.589 19:12:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:35.589 19:12:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1602665 00:05:35.589 19:12:22 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:35.589 19:12:22 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:35.589 19:12:22 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1602665' 00:05:35.589 killing process with pid 1602665 00:05:35.589 19:12:22 -- common/autotest_common.sh@955 -- # kill 1602665 00:05:35.589 19:12:22 -- common/autotest_common.sh@960 -- # wait 1602665 00:05:35.589 00:05:35.589 real 0m5.373s 00:05:35.589 user 0m5.099s 00:05:35.589 sys 0m0.301s 00:05:35.589 19:12:22 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:35.589 19:12:22 -- common/autotest_common.sh@10 -- # set +x 00:05:35.589 ************************************ 00:05:35.589 END TEST skip_rpc 00:05:35.589 ************************************ 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:35.589 19:12:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.589 19:12:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.589 19:12:22 -- common/autotest_common.sh@10 -- # set +x 00:05:35.589 ************************************ 00:05:35.589 START TEST skip_rpc_with_json 00:05:35.589 ************************************ 
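The with_json variant starting here first builds a config against a live RPC server, then proves the JSON alone can bring the target up; a condensed sketch of the steps below (same helpers, paths shortened, and the probe-then-create pair collapsed into a shell ||):

    # With the target listening: create the TCP transport only if the
    # nvmf_get_transports probe reports it missing.
    rpc_cmd nvmf_get_transports --trtype tcp || rpc_cmd nvmf_create_transport -t tcp
    rpc_cmd save_config > test/rpc/config.json
    killprocess $spdk_pid
    # Replay without an RPC server; the saved JSON must recreate the
    # transport, which is then asserted by grepping the app log.
    spdk_tgt --no-rpc-server -m 0x1 --json test/rpc/config.json &> test/rpc/log.txt
    grep -q 'TCP Transport Init' test/rpc/log.txt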
00:05:35.589 19:12:22 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_json 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1603420 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:35.589 19:12:22 -- rpc/skip_rpc.sh@31 -- # waitforlisten 1603420 00:05:35.589 19:12:22 -- common/autotest_common.sh@817 -- # '[' -z 1603420 ']' 00:05:35.589 19:12:22 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.589 19:12:22 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:35.589 19:12:22 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.589 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.589 19:12:22 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:35.589 19:12:22 -- common/autotest_common.sh@10 -- # set +x 00:05:35.589 [2024-04-24 19:12:22.540455] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:35.589 [2024-04-24 19:12:22.540512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603420 ] 00:05:35.589 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.847 [2024-04-24 19:12:22.616744] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.847 [2024-04-24 19:12:22.708216] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.414 19:12:23 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:36.414 19:12:23 -- common/autotest_common.sh@850 -- # return 0 00:05:36.414 19:12:23 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:36.414 19:12:23 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:36.414 19:12:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.414 [2024-04-24 19:12:23.360301] nvmf_rpc.c:2513:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:36.414 request: 00:05:36.414 { 00:05:36.414 "trtype": "tcp", 00:05:36.414 "method": "nvmf_get_transports", 00:05:36.414 "req_id": 1 00:05:36.414 } 00:05:36.414 Got JSON-RPC error response 00:05:36.414 response: 00:05:36.414 { 00:05:36.414 "code": -19, 00:05:36.414 "message": "No such device" 00:05:36.414 } 00:05:36.414 19:12:23 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:05:36.414 19:12:23 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:36.414 19:12:23 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:36.414 19:12:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.414 [2024-04-24 19:12:23.372379] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:36.414 19:12:23 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:36.414 19:12:23 -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:36.414 19:12:23 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:36.414 19:12:23 -- common/autotest_common.sh@10 -- # set +x 00:05:36.673 19:12:23 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:36.673 19:12:23 -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:36.673 
{ 00:05:36.673 "subsystems": [ 00:05:36.673 { 00:05:36.673 "subsystem": "scheduler", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.673 "method": "framework_set_scheduler", 00:05:36.673 "params": { 00:05:36.673 "name": "static" 00:05:36.673 } 00:05:36.673 } 00:05:36.673 ] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "vmd", 00:05:36.673 "config": [] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "sock", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.673 "method": "sock_impl_set_options", 00:05:36.673 "params": { 00:05:36.673 "impl_name": "posix", 00:05:36.673 "recv_buf_size": 2097152, 00:05:36.673 "send_buf_size": 2097152, 00:05:36.673 "enable_recv_pipe": true, 00:05:36.673 "enable_quickack": false, 00:05:36.673 "enable_placement_id": 0, 00:05:36.673 "enable_zerocopy_send_server": true, 00:05:36.673 "enable_zerocopy_send_client": false, 00:05:36.673 "zerocopy_threshold": 0, 00:05:36.673 "tls_version": 0, 00:05:36.673 "enable_ktls": false 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "sock_impl_set_options", 00:05:36.673 "params": { 00:05:36.673 "impl_name": "ssl", 00:05:36.673 "recv_buf_size": 4096, 00:05:36.673 "send_buf_size": 4096, 00:05:36.673 "enable_recv_pipe": true, 00:05:36.673 "enable_quickack": false, 00:05:36.673 "enable_placement_id": 0, 00:05:36.673 "enable_zerocopy_send_server": true, 00:05:36.673 "enable_zerocopy_send_client": false, 00:05:36.673 "zerocopy_threshold": 0, 00:05:36.673 "tls_version": 0, 00:05:36.673 "enable_ktls": false 00:05:36.673 } 00:05:36.673 } 00:05:36.673 ] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "iobuf", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.673 "method": "iobuf_set_options", 00:05:36.673 "params": { 00:05:36.673 "small_pool_count": 8192, 00:05:36.673 "large_pool_count": 1024, 00:05:36.673 "small_bufsize": 8192, 00:05:36.673 "large_bufsize": 135168 00:05:36.673 } 00:05:36.673 } 00:05:36.673 ] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "keyring", 00:05:36.673 "config": [] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "vfio_user_target", 00:05:36.673 "config": null 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "accel", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.673 "method": "accel_set_options", 00:05:36.673 "params": { 00:05:36.673 "small_cache_size": 128, 00:05:36.673 "large_cache_size": 16, 00:05:36.673 "task_count": 2048, 00:05:36.673 "sequence_count": 2048, 00:05:36.673 "buf_count": 2048 00:05:36.673 } 00:05:36.673 } 00:05:36.673 ] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "bdev", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.673 "method": "bdev_set_options", 00:05:36.673 "params": { 00:05:36.673 "bdev_io_pool_size": 65535, 00:05:36.673 "bdev_io_cache_size": 256, 00:05:36.673 "bdev_auto_examine": true, 00:05:36.673 "iobuf_small_cache_size": 128, 00:05:36.673 "iobuf_large_cache_size": 16 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "bdev_raid_set_options", 00:05:36.673 "params": { 00:05:36.673 "process_window_size_kb": 1024 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "bdev_nvme_set_options", 00:05:36.673 "params": { 00:05:36.673 "action_on_timeout": "none", 00:05:36.673 "timeout_us": 0, 00:05:36.673 "timeout_admin_us": 0, 00:05:36.673 "keep_alive_timeout_ms": 10000, 00:05:36.673 "arbitration_burst": 0, 00:05:36.673 "low_priority_weight": 0, 00:05:36.673 "medium_priority_weight": 0, 00:05:36.673 "high_priority_weight": 0, 00:05:36.673 "nvme_adminq_poll_period_us": 
10000, 00:05:36.673 "nvme_ioq_poll_period_us": 0, 00:05:36.673 "io_queue_requests": 0, 00:05:36.673 "delay_cmd_submit": true, 00:05:36.673 "transport_retry_count": 4, 00:05:36.673 "bdev_retry_count": 3, 00:05:36.673 "transport_ack_timeout": 0, 00:05:36.673 "ctrlr_loss_timeout_sec": 0, 00:05:36.673 "reconnect_delay_sec": 0, 00:05:36.673 "fast_io_fail_timeout_sec": 0, 00:05:36.673 "disable_auto_failback": false, 00:05:36.673 "generate_uuids": false, 00:05:36.673 "transport_tos": 0, 00:05:36.673 "nvme_error_stat": false, 00:05:36.673 "rdma_srq_size": 0, 00:05:36.673 "io_path_stat": false, 00:05:36.673 "allow_accel_sequence": false, 00:05:36.673 "rdma_max_cq_size": 0, 00:05:36.673 "rdma_cm_event_timeout_ms": 0, 00:05:36.673 "dhchap_digests": [ 00:05:36.673 "sha256", 00:05:36.673 "sha384", 00:05:36.673 "sha512" 00:05:36.673 ], 00:05:36.673 "dhchap_dhgroups": [ 00:05:36.673 "null", 00:05:36.673 "ffdhe2048", 00:05:36.673 "ffdhe3072", 00:05:36.673 "ffdhe4096", 00:05:36.673 "ffdhe6144", 00:05:36.673 "ffdhe8192" 00:05:36.673 ] 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "bdev_nvme_set_hotplug", 00:05:36.673 "params": { 00:05:36.673 "period_us": 100000, 00:05:36.673 "enable": false 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "bdev_iscsi_set_options", 00:05:36.673 "params": { 00:05:36.673 "timeout_sec": 30 00:05:36.673 } 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "method": "bdev_wait_for_examine" 00:05:36.673 } 00:05:36.673 ] 00:05:36.673 }, 00:05:36.673 { 00:05:36.673 "subsystem": "nvmf", 00:05:36.673 "config": [ 00:05:36.673 { 00:05:36.674 "method": "nvmf_set_config", 00:05:36.674 "params": { 00:05:36.674 "discovery_filter": "match_any", 00:05:36.674 "admin_cmd_passthru": { 00:05:36.674 "identify_ctrlr": false 00:05:36.674 } 00:05:36.674 } 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "method": "nvmf_set_max_subsystems", 00:05:36.674 "params": { 00:05:36.674 "max_subsystems": 1024 00:05:36.674 } 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "method": "nvmf_set_crdt", 00:05:36.674 "params": { 00:05:36.674 "crdt1": 0, 00:05:36.674 "crdt2": 0, 00:05:36.674 "crdt3": 0 00:05:36.674 } 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "method": "nvmf_create_transport", 00:05:36.674 "params": { 00:05:36.674 "trtype": "TCP", 00:05:36.674 "max_queue_depth": 128, 00:05:36.674 "max_io_qpairs_per_ctrlr": 127, 00:05:36.674 "in_capsule_data_size": 4096, 00:05:36.674 "max_io_size": 131072, 00:05:36.674 "io_unit_size": 131072, 00:05:36.674 "max_aq_depth": 128, 00:05:36.674 "num_shared_buffers": 511, 00:05:36.674 "buf_cache_size": 4294967295, 00:05:36.674 "dif_insert_or_strip": false, 00:05:36.674 "zcopy": false, 00:05:36.674 "c2h_success": true, 00:05:36.674 "sock_priority": 0, 00:05:36.674 "abort_timeout_sec": 1, 00:05:36.674 "ack_timeout": 0, 00:05:36.674 "data_wr_pool_size": 0 00:05:36.674 } 00:05:36.674 } 00:05:36.674 ] 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "nbd", 00:05:36.674 "config": [] 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "ublk", 00:05:36.674 "config": [] 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "vhost_blk", 00:05:36.674 "config": [] 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "scsi", 00:05:36.674 "config": null 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "iscsi", 00:05:36.674 "config": [ 00:05:36.674 { 00:05:36.674 "method": "iscsi_set_options", 00:05:36.674 "params": { 00:05:36.674 "node_base": "iqn.2016-06.io.spdk", 00:05:36.674 "max_sessions": 128, 00:05:36.674 
"max_connections_per_session": 2, 00:05:36.674 "max_queue_depth": 64, 00:05:36.674 "default_time2wait": 2, 00:05:36.674 "default_time2retain": 20, 00:05:36.674 "first_burst_length": 8192, 00:05:36.674 "immediate_data": true, 00:05:36.674 "allow_duplicated_isid": false, 00:05:36.674 "error_recovery_level": 0, 00:05:36.674 "nop_timeout": 60, 00:05:36.674 "nop_in_interval": 30, 00:05:36.674 "disable_chap": false, 00:05:36.674 "require_chap": false, 00:05:36.674 "mutual_chap": false, 00:05:36.674 "chap_group": 0, 00:05:36.674 "max_large_datain_per_connection": 64, 00:05:36.674 "max_r2t_per_connection": 4, 00:05:36.674 "pdu_pool_size": 36864, 00:05:36.674 "immediate_data_pool_size": 16384, 00:05:36.674 "data_out_pool_size": 2048 00:05:36.674 } 00:05:36.674 } 00:05:36.674 ] 00:05:36.674 }, 00:05:36.674 { 00:05:36.674 "subsystem": "vhost_scsi", 00:05:36.674 "config": [] 00:05:36.674 } 00:05:36.674 ] 00:05:36.674 } 00:05:36.674 19:12:23 -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:36.674 19:12:23 -- rpc/skip_rpc.sh@40 -- # killprocess 1603420 00:05:36.674 19:12:23 -- common/autotest_common.sh@936 -- # '[' -z 1603420 ']' 00:05:36.674 19:12:23 -- common/autotest_common.sh@940 -- # kill -0 1603420 00:05:36.674 19:12:23 -- common/autotest_common.sh@941 -- # uname 00:05:36.674 19:12:23 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:36.674 19:12:23 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1603420 00:05:36.674 19:12:23 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:36.674 19:12:23 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:36.674 19:12:23 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1603420' 00:05:36.674 killing process with pid 1603420 00:05:36.674 19:12:23 -- common/autotest_common.sh@955 -- # kill 1603420 00:05:36.674 19:12:23 -- common/autotest_common.sh@960 -- # wait 1603420 00:05:36.932 19:12:23 -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1603637 00:05:36.932 19:12:23 -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:36.932 19:12:23 -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:42.197 19:12:28 -- rpc/skip_rpc.sh@50 -- # killprocess 1603637 00:05:42.197 19:12:28 -- common/autotest_common.sh@936 -- # '[' -z 1603637 ']' 00:05:42.197 19:12:28 -- common/autotest_common.sh@940 -- # kill -0 1603637 00:05:42.197 19:12:28 -- common/autotest_common.sh@941 -- # uname 00:05:42.197 19:12:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:42.197 19:12:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1603637 00:05:42.197 19:12:28 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:42.197 19:12:28 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:42.197 19:12:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1603637' 00:05:42.197 killing process with pid 1603637 00:05:42.197 19:12:28 -- common/autotest_common.sh@955 -- # kill 1603637 00:05:42.197 19:12:28 -- common/autotest_common.sh@960 -- # wait 1603637 00:05:42.455 19:12:29 -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:42.456 19:12:29 -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:42.456 00:05:42.456 real 0m6.811s 00:05:42.456 user 0m6.540s 00:05:42.456 sys 
0m0.700s 00:05:42.456 19:12:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:42.456 19:12:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.456 ************************************ 00:05:42.456 END TEST skip_rpc_with_json 00:05:42.456 ************************************ 00:05:42.456 19:12:29 -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:42.456 19:12:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.456 19:12:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.456 19:12:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.715 ************************************ 00:05:42.715 START TEST skip_rpc_with_delay 00:05:42.715 ************************************ 00:05:42.715 19:12:29 -- common/autotest_common.sh@1111 -- # test_skip_rpc_with_delay 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:42.715 19:12:29 -- common/autotest_common.sh@638 -- # local es=0 00:05:42.715 19:12:29 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:42.715 19:12:29 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.715 19:12:29 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:42.715 19:12:29 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.715 19:12:29 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:42.715 19:12:29 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.715 19:12:29 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:42.715 19:12:29 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.715 19:12:29 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:42.715 19:12:29 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:42.715 [2024-04-24 19:12:29.526726] app.c: 751:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
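The delay test here is a pure negative check: --wait-for-rpc holds startup until an RPC tells the app to continue, so pairing it with --no-rpc-server can never make progress and must be rejected up front. The whole assertion is one line (same helpers as above):

    # Passes only if spdk_tgt exits non-zero with the "Cannot use
    # '--wait-for-rpc'" error recorded above.
    NOT spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc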
00:05:42.715 [2024-04-24 19:12:29.526800] app.c: 630:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:42.715 19:12:29 -- common/autotest_common.sh@641 -- # es=1 00:05:42.715 19:12:29 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:42.715 19:12:29 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:05:42.715 19:12:29 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:42.715 00:05:42.715 real 0m0.029s 00:05:42.715 user 0m0.017s 00:05:42.715 sys 0m0.013s 00:05:42.715 19:12:29 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:42.715 19:12:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.715 ************************************ 00:05:42.715 END TEST skip_rpc_with_delay 00:05:42.715 ************************************ 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@77 -- # uname 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:42.715 19:12:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.715 19:12:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.715 19:12:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.715 ************************************ 00:05:42.715 START TEST exit_on_failed_rpc_init 00:05:42.715 ************************************ 00:05:42.715 19:12:29 -- common/autotest_common.sh@1111 -- # test_exit_on_failed_rpc_init 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1604524 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@63 -- # waitforlisten 1604524 00:05:42.715 19:12:29 -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:42.715 19:12:29 -- common/autotest_common.sh@817 -- # '[' -z 1604524 ']' 00:05:42.715 19:12:29 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.715 19:12:29 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:42.715 19:12:29 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.715 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:42.715 19:12:29 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:42.715 19:12:29 -- common/autotest_common.sh@10 -- # set +x 00:05:42.973 [2024-04-24 19:12:29.748222] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:05:42.973 [2024-04-24 19:12:29.748282] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1604524 ] 00:05:42.973 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.973 [2024-04-24 19:12:29.824294] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.973 [2024-04-24 19:12:29.915012] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.542 19:12:30 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:43.542 19:12:30 -- common/autotest_common.sh@850 -- # return 0 00:05:43.542 19:12:30 -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:43.542 19:12:30 -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:43.542 19:12:30 -- common/autotest_common.sh@638 -- # local es=0 00:05:43.542 19:12:30 -- common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:43.542 19:12:30 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:43.542 19:12:30 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:43.542 19:12:30 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:43.542 19:12:30 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:43.542 19:12:30 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:43.542 19:12:30 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:05:43.542 19:12:30 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:43.542 19:12:30 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:43.801 19:12:30 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:43.801 [2024-04-24 19:12:30.582853] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:43.801 [2024-04-24 19:12:30.582949] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1604574 ] 00:05:43.801 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.801 [2024-04-24 19:12:30.660089] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.801 [2024-04-24 19:12:30.744331] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:43.801 [2024-04-24 19:12:30.744441] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
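That failure is the point of exit_on_failed_rpc_init: two targets cannot share /var/tmp/spdk.sock, so the second instance must fail RPC initialization and exit non-zero; condensed with the same helpers:

    spdk_tgt -m 0x1 &      # first instance owns /var/tmp/spdk.sock
    spdk_pid=$!
    waitforlisten $spdk_pid
    # Second instance, different core mask, same default socket path:
    # it must hit "socket in use", stop the app, and exit non-zero.
    NOT spdk_tgt -m 0x2
    killprocess $spdk_pid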
00:05:43.801 [2024-04-24 19:12:30.744458] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:43.801 [2024-04-24 19:12:30.744472] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:44.060 19:12:30 -- common/autotest_common.sh@641 -- # es=234 00:05:44.060 19:12:30 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:05:44.060 19:12:30 -- common/autotest_common.sh@650 -- # es=106 00:05:44.060 19:12:30 -- common/autotest_common.sh@651 -- # case "$es" in 00:05:44.060 19:12:30 -- common/autotest_common.sh@658 -- # es=1 00:05:44.060 19:12:30 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:05:44.060 19:12:30 -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:44.060 19:12:30 -- rpc/skip_rpc.sh@70 -- # killprocess 1604524 00:05:44.060 19:12:30 -- common/autotest_common.sh@936 -- # '[' -z 1604524 ']' 00:05:44.060 19:12:30 -- common/autotest_common.sh@940 -- # kill -0 1604524 00:05:44.060 19:12:30 -- common/autotest_common.sh@941 -- # uname 00:05:44.060 19:12:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:44.060 19:12:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1604524 00:05:44.060 19:12:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:44.060 19:12:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:44.060 19:12:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1604524' 00:05:44.060 killing process with pid 1604524 00:05:44.060 19:12:30 -- common/autotest_common.sh@955 -- # kill 1604524 00:05:44.060 19:12:30 -- common/autotest_common.sh@960 -- # wait 1604524 00:05:44.319 00:05:44.319 real 0m1.490s 00:05:44.319 user 0m1.645s 00:05:44.319 sys 0m0.461s 00:05:44.319 19:12:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:44.319 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:44.319 ************************************ 00:05:44.319 END TEST exit_on_failed_rpc_init 00:05:44.319 ************************************ 00:05:44.319 19:12:31 -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:44.319 00:05:44.319 real 0m14.540s 00:05:44.319 user 0m13.589s 00:05:44.319 sys 0m1.997s 00:05:44.319 19:12:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:44.319 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:44.319 ************************************ 00:05:44.319 END TEST skip_rpc 00:05:44.319 ************************************ 00:05:44.319 19:12:31 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:44.319 19:12:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.319 19:12:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.319 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:44.577 ************************************ 00:05:44.577 START TEST rpc_client 00:05:44.577 ************************************ 00:05:44.577 19:12:31 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:44.577 * Looking for test storage... 
00:05:44.577 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:44.577 19:12:31 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:44.577 OK 00:05:44.577 19:12:31 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:44.577 00:05:44.577 real 0m0.108s 00:05:44.577 user 0m0.033s 00:05:44.577 sys 0m0.080s 00:05:44.577 19:12:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:44.577 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:44.577 ************************************ 00:05:44.577 END TEST rpc_client 00:05:44.577 ************************************ 00:05:44.836 19:12:31 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:44.836 19:12:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.836 19:12:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.836 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:44.836 ************************************ 00:05:44.836 START TEST json_config 00:05:44.836 ************************************ 00:05:44.836 19:12:31 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:44.836 19:12:31 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:44.836 19:12:31 -- nvmf/common.sh@7 -- # uname -s 00:05:44.836 19:12:31 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:44.836 19:12:31 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:44.836 19:12:31 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:44.836 19:12:31 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:44.836 19:12:31 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:44.836 19:12:31 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:44.836 19:12:31 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:44.836 19:12:31 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:44.836 19:12:31 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:45.095 19:12:31 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:45.095 19:12:31 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:05:45.095 19:12:31 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:05:45.095 19:12:31 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:45.095 19:12:31 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:45.095 19:12:31 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:45.095 19:12:31 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:45.095 19:12:31 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:45.095 19:12:31 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:45.095 19:12:31 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:45.095 19:12:31 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:45.096 19:12:31 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.096 19:12:31 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.096 19:12:31 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.096 19:12:31 -- paths/export.sh@5 -- # export PATH 00:05:45.096 19:12:31 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.096 19:12:31 -- nvmf/common.sh@47 -- # : 0 00:05:45.096 19:12:31 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:45.096 19:12:31 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:45.096 19:12:31 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:45.096 19:12:31 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:45.096 19:12:31 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:45.096 19:12:31 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:45.096 19:12:31 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:45.096 19:12:31 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:45.096 19:12:31 -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:05:45.096 19:12:31 -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:45.096 19:12:31 -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:45.096 19:12:31 -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:45.096 19:12:31 -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:45.096 19:12:31 -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:45.096 WARNING: No tests are enabled so not running JSON configuration tests 00:05:45.096 19:12:31 -- json_config/json_config.sh@28 -- # exit 0 00:05:45.096 00:05:45.096 real 0m0.115s 00:05:45.096 user 0m0.061s 00:05:45.096 sys 0m0.055s 00:05:45.096 19:12:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:45.096 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:45.096 ************************************ 00:05:45.096 END TEST 
json_config 00:05:45.096 ************************************ 00:05:45.096 19:12:31 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:45.096 19:12:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.096 19:12:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.096 19:12:31 -- common/autotest_common.sh@10 -- # set +x 00:05:45.096 ************************************ 00:05:45.096 START TEST json_config_extra_key 00:05:45.096 ************************************ 00:05:45.096 19:12:32 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:45.356 19:12:32 -- nvmf/common.sh@7 -- # uname -s 00:05:45.356 19:12:32 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:45.356 19:12:32 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:45.356 19:12:32 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:45.356 19:12:32 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:45.356 19:12:32 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:45.356 19:12:32 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:45.356 19:12:32 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:45.356 19:12:32 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:45.356 19:12:32 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:45.356 19:12:32 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:45.356 19:12:32 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:05:45.356 19:12:32 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:05:45.356 19:12:32 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:45.356 19:12:32 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:45.356 19:12:32 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:45.356 19:12:32 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:45.356 19:12:32 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:45.356 19:12:32 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:45.356 19:12:32 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:45.356 19:12:32 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:45.356 19:12:32 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.356 19:12:32 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.356 19:12:32 -- paths/export.sh@4 -- # 
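The extra_key launch that follows brings the target up in a constrained footprint, on a private RPC socket, configured entirely from a JSON file; the invocation below, shortened (app_pid is the associative array declared in json_config/common.sh):

    spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
        --json test/json_config/extra_key.json &
    app_pid[target]=$!
    waitforlisten "${app_pid[target]}" /var/tmp/spdk_tgt.sock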
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.356 19:12:32 -- paths/export.sh@5 -- # export PATH 00:05:45.356 19:12:32 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:45.356 19:12:32 -- nvmf/common.sh@47 -- # : 0 00:05:45.356 19:12:32 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:45.356 19:12:32 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:45.356 19:12:32 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:45.356 19:12:32 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:45.356 19:12:32 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:45.356 19:12:32 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:45.356 19:12:32 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:45.356 19:12:32 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:45.356 INFO: launching applications... 
00:05:45.356 19:12:32 -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:45.356 19:12:32 -- json_config/common.sh@9 -- # local app=target 00:05:45.356 19:12:32 -- json_config/common.sh@10 -- # shift 00:05:45.356 19:12:32 -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:45.356 19:12:32 -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:45.356 19:12:32 -- json_config/common.sh@15 -- # local app_extra_params= 00:05:45.356 19:12:32 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:45.356 19:12:32 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:45.356 19:12:32 -- json_config/common.sh@22 -- # app_pid["$app"]=1605042 00:05:45.356 19:12:32 -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:45.356 Waiting for target to run... 00:05:45.356 19:12:32 -- json_config/common.sh@25 -- # waitforlisten 1605042 /var/tmp/spdk_tgt.sock 00:05:45.356 19:12:32 -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:45.356 19:12:32 -- common/autotest_common.sh@817 -- # '[' -z 1605042 ']' 00:05:45.356 19:12:32 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:45.356 19:12:32 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:45.356 19:12:32 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:45.356 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:45.356 19:12:32 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:45.356 19:12:32 -- common/autotest_common.sh@10 -- # set +x 00:05:45.356 [2024-04-24 19:12:32.213963] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:45.356 [2024-04-24 19:12:32.214034] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605042 ] 00:05:45.356 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.924 [2024-04-24 19:12:32.718046] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.924 [2024-04-24 19:12:32.805070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.181 19:12:33 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:46.181 19:12:33 -- common/autotest_common.sh@850 -- # return 0 00:05:46.181 19:12:33 -- json_config/common.sh@26 -- # echo '' 00:05:46.181 00:05:46.181 19:12:33 -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:46.181 INFO: shutting down applications... 
00:05:46.181 19:12:33 -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:46.181 19:12:33 -- json_config/common.sh@31 -- # local app=target 00:05:46.181 19:12:33 -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:46.181 19:12:33 -- json_config/common.sh@35 -- # [[ -n 1605042 ]] 00:05:46.181 19:12:33 -- json_config/common.sh@38 -- # kill -SIGINT 1605042 00:05:46.181 19:12:33 -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:46.181 19:12:33 -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:46.181 19:12:33 -- json_config/common.sh@41 -- # kill -0 1605042 00:05:46.182 19:12:33 -- json_config/common.sh@45 -- # sleep 0.5 00:05:46.749 19:12:33 -- json_config/common.sh@40 -- # (( i++ )) 00:05:46.749 19:12:33 -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:46.749 19:12:33 -- json_config/common.sh@41 -- # kill -0 1605042 00:05:46.749 19:12:33 -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:46.749 19:12:33 -- json_config/common.sh@43 -- # break 00:05:46.749 19:12:33 -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:46.749 19:12:33 -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:46.749 SPDK target shutdown done 00:05:46.749 19:12:33 -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:46.749 Success 00:05:46.749 00:05:46.749 real 0m1.462s 00:05:46.749 user 0m1.011s 00:05:46.749 sys 0m0.623s 00:05:46.749 19:12:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:46.749 19:12:33 -- common/autotest_common.sh@10 -- # set +x 00:05:46.749 ************************************ 00:05:46.749 END TEST json_config_extra_key 00:05:46.749 ************************************ 00:05:46.749 19:12:33 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:46.749 19:12:33 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.750 19:12:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.750 19:12:33 -- common/autotest_common.sh@10 -- # set +x 00:05:46.750 ************************************ 00:05:46.750 START TEST alias_rpc 00:05:46.750 ************************************ 00:05:46.750 19:12:33 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:47.008 * Looking for test storage... 00:05:47.008 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:47.008 19:12:33 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:47.008 19:12:33 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1605277 00:05:47.008 19:12:33 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1605277 00:05:47.008 19:12:33 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:47.008 19:12:33 -- common/autotest_common.sh@817 -- # '[' -z 1605277 ']' 00:05:47.008 19:12:33 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:47.008 19:12:33 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:47.008 19:12:33 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:47.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:47.008 19:12:33 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:47.008 19:12:33 -- common/autotest_common.sh@10 -- # set +x 00:05:47.008 [2024-04-24 19:12:33.857970] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:47.008 [2024-04-24 19:12:33.858046] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605277 ] 00:05:47.008 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.008 [2024-04-24 19:12:33.935348] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.008 [2024-04-24 19:12:34.016587] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.948 19:12:34 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:47.948 19:12:34 -- common/autotest_common.sh@850 -- # return 0 00:05:47.948 19:12:34 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:47.948 19:12:34 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1605277 00:05:47.948 19:12:34 -- common/autotest_common.sh@936 -- # '[' -z 1605277 ']' 00:05:47.948 19:12:34 -- common/autotest_common.sh@940 -- # kill -0 1605277 00:05:47.948 19:12:34 -- common/autotest_common.sh@941 -- # uname 00:05:47.948 19:12:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:47.948 19:12:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1605277 00:05:47.949 19:12:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:47.949 19:12:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:47.949 19:12:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1605277' 00:05:47.949 killing process with pid 1605277 00:05:47.949 19:12:34 -- common/autotest_common.sh@955 -- # kill 1605277 00:05:47.949 19:12:34 -- common/autotest_common.sh@960 -- # wait 1605277 00:05:48.515 00:05:48.515 real 0m1.535s 00:05:48.515 user 0m1.611s 00:05:48.515 sys 0m0.472s 00:05:48.515 19:12:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:48.515 19:12:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.515 ************************************ 00:05:48.515 END TEST alias_rpc 00:05:48.515 ************************************ 00:05:48.515 19:12:35 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:05:48.515 19:12:35 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:48.515 19:12:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:48.515 19:12:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:48.515 19:12:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.515 ************************************ 00:05:48.515 START TEST spdkcli_tcp 00:05:48.515 ************************************ 00:05:48.515 19:12:35 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:48.773 * Looking for test storage... 
00:05:48.773 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:48.773 19:12:35 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:48.773 19:12:35 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:48.773 19:12:35 -- common/autotest_common.sh@710 -- # xtrace_disable 00:05:48.773 19:12:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1605529 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@27 -- # waitforlisten 1605529 00:05:48.773 19:12:35 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:48.773 19:12:35 -- common/autotest_common.sh@817 -- # '[' -z 1605529 ']' 00:05:48.773 19:12:35 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.773 19:12:35 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:48.773 19:12:35 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.773 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.773 19:12:35 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:48.773 19:12:35 -- common/autotest_common.sh@10 -- # set +x 00:05:48.773 [2024-04-24 19:12:35.584662] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:05:48.773 [2024-04-24 19:12:35.584727] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605529 ] 00:05:48.773 EAL: No free 2048 kB hugepages reported on node 1 00:05:48.773 [2024-04-24 19:12:35.660715] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:48.773 [2024-04-24 19:12:35.742227] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:48.773 [2024-04-24 19:12:35.742229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.707 19:12:36 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:49.707 19:12:36 -- common/autotest_common.sh@850 -- # return 0 00:05:49.707 19:12:36 -- spdkcli/tcp.sh@31 -- # socat_pid=1605700 00:05:49.707 19:12:36 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:49.707 19:12:36 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:49.707 [ 00:05:49.707 "spdk_get_version", 00:05:49.707 "rpc_get_methods", 00:05:49.707 "trace_get_info", 00:05:49.707 "trace_get_tpoint_group_mask", 00:05:49.707 "trace_disable_tpoint_group", 00:05:49.707 "trace_enable_tpoint_group", 00:05:49.707 "trace_clear_tpoint_mask", 00:05:49.707 "trace_set_tpoint_mask", 00:05:49.707 "vfu_tgt_set_base_path", 00:05:49.707 "framework_get_pci_devices", 00:05:49.707 "framework_get_config", 00:05:49.707 "framework_get_subsystems", 00:05:49.707 "keyring_get_keys", 00:05:49.707 "iobuf_get_stats", 00:05:49.707 "iobuf_set_options", 00:05:49.707 "sock_set_default_impl", 00:05:49.707 "sock_impl_set_options", 00:05:49.707 "sock_impl_get_options", 00:05:49.707 "vmd_rescan", 00:05:49.707 "vmd_remove_device", 00:05:49.707 "vmd_enable", 00:05:49.707 "accel_get_stats", 00:05:49.707 "accel_set_options", 00:05:49.707 "accel_set_driver", 00:05:49.707 "accel_crypto_key_destroy", 00:05:49.707 "accel_crypto_keys_get", 00:05:49.707 "accel_crypto_key_create", 00:05:49.707 "accel_assign_opc", 00:05:49.707 "accel_get_module_info", 00:05:49.707 "accel_get_opc_assignments", 00:05:49.707 "notify_get_notifications", 00:05:49.707 "notify_get_types", 00:05:49.707 "bdev_get_histogram", 00:05:49.707 "bdev_enable_histogram", 00:05:49.707 "bdev_set_qos_limit", 00:05:49.707 "bdev_set_qd_sampling_period", 00:05:49.707 "bdev_get_bdevs", 00:05:49.707 "bdev_reset_iostat", 00:05:49.707 "bdev_get_iostat", 00:05:49.707 "bdev_examine", 00:05:49.707 "bdev_wait_for_examine", 00:05:49.707 "bdev_set_options", 00:05:49.707 "scsi_get_devices", 00:05:49.707 "thread_set_cpumask", 00:05:49.708 "framework_get_scheduler", 00:05:49.708 "framework_set_scheduler", 00:05:49.708 "framework_get_reactors", 00:05:49.708 "thread_get_io_channels", 00:05:49.708 "thread_get_pollers", 00:05:49.708 "thread_get_stats", 00:05:49.708 "framework_monitor_context_switch", 00:05:49.708 "spdk_kill_instance", 00:05:49.708 "log_enable_timestamps", 00:05:49.708 "log_get_flags", 00:05:49.708 "log_clear_flag", 00:05:49.708 "log_set_flag", 00:05:49.708 "log_get_level", 00:05:49.708 "log_set_level", 00:05:49.708 "log_get_print_level", 00:05:49.708 "log_set_print_level", 00:05:49.708 "framework_enable_cpumask_locks", 00:05:49.708 "framework_disable_cpumask_locks", 00:05:49.708 "framework_wait_init", 00:05:49.708 "framework_start_init", 00:05:49.708 "virtio_blk_create_transport", 00:05:49.708 "virtio_blk_get_transports", 
00:05:49.708 "vhost_controller_set_coalescing", 00:05:49.708 "vhost_get_controllers", 00:05:49.708 "vhost_delete_controller", 00:05:49.708 "vhost_create_blk_controller", 00:05:49.708 "vhost_scsi_controller_remove_target", 00:05:49.708 "vhost_scsi_controller_add_target", 00:05:49.708 "vhost_start_scsi_controller", 00:05:49.708 "vhost_create_scsi_controller", 00:05:49.708 "ublk_recover_disk", 00:05:49.708 "ublk_get_disks", 00:05:49.708 "ublk_stop_disk", 00:05:49.708 "ublk_start_disk", 00:05:49.708 "ublk_destroy_target", 00:05:49.708 "ublk_create_target", 00:05:49.708 "nbd_get_disks", 00:05:49.708 "nbd_stop_disk", 00:05:49.708 "nbd_start_disk", 00:05:49.708 "env_dpdk_get_mem_stats", 00:05:49.708 "nvmf_subsystem_get_listeners", 00:05:49.708 "nvmf_subsystem_get_qpairs", 00:05:49.708 "nvmf_subsystem_get_controllers", 00:05:49.708 "nvmf_get_stats", 00:05:49.708 "nvmf_get_transports", 00:05:49.708 "nvmf_create_transport", 00:05:49.708 "nvmf_get_targets", 00:05:49.708 "nvmf_delete_target", 00:05:49.708 "nvmf_create_target", 00:05:49.708 "nvmf_subsystem_allow_any_host", 00:05:49.708 "nvmf_subsystem_remove_host", 00:05:49.708 "nvmf_subsystem_add_host", 00:05:49.708 "nvmf_ns_remove_host", 00:05:49.708 "nvmf_ns_add_host", 00:05:49.708 "nvmf_subsystem_remove_ns", 00:05:49.708 "nvmf_subsystem_add_ns", 00:05:49.708 "nvmf_subsystem_listener_set_ana_state", 00:05:49.708 "nvmf_discovery_get_referrals", 00:05:49.708 "nvmf_discovery_remove_referral", 00:05:49.708 "nvmf_discovery_add_referral", 00:05:49.708 "nvmf_subsystem_remove_listener", 00:05:49.708 "nvmf_subsystem_add_listener", 00:05:49.708 "nvmf_delete_subsystem", 00:05:49.708 "nvmf_create_subsystem", 00:05:49.708 "nvmf_get_subsystems", 00:05:49.708 "nvmf_set_crdt", 00:05:49.708 "nvmf_set_config", 00:05:49.708 "nvmf_set_max_subsystems", 00:05:49.708 "iscsi_get_histogram", 00:05:49.708 "iscsi_enable_histogram", 00:05:49.708 "iscsi_set_options", 00:05:49.708 "iscsi_get_auth_groups", 00:05:49.708 "iscsi_auth_group_remove_secret", 00:05:49.708 "iscsi_auth_group_add_secret", 00:05:49.708 "iscsi_delete_auth_group", 00:05:49.708 "iscsi_create_auth_group", 00:05:49.708 "iscsi_set_discovery_auth", 00:05:49.708 "iscsi_get_options", 00:05:49.708 "iscsi_target_node_request_logout", 00:05:49.708 "iscsi_target_node_set_redirect", 00:05:49.708 "iscsi_target_node_set_auth", 00:05:49.708 "iscsi_target_node_add_lun", 00:05:49.708 "iscsi_get_stats", 00:05:49.708 "iscsi_get_connections", 00:05:49.708 "iscsi_portal_group_set_auth", 00:05:49.708 "iscsi_start_portal_group", 00:05:49.708 "iscsi_delete_portal_group", 00:05:49.708 "iscsi_create_portal_group", 00:05:49.708 "iscsi_get_portal_groups", 00:05:49.708 "iscsi_delete_target_node", 00:05:49.708 "iscsi_target_node_remove_pg_ig_maps", 00:05:49.708 "iscsi_target_node_add_pg_ig_maps", 00:05:49.708 "iscsi_create_target_node", 00:05:49.708 "iscsi_get_target_nodes", 00:05:49.708 "iscsi_delete_initiator_group", 00:05:49.708 "iscsi_initiator_group_remove_initiators", 00:05:49.708 "iscsi_initiator_group_add_initiators", 00:05:49.708 "iscsi_create_initiator_group", 00:05:49.708 "iscsi_get_initiator_groups", 00:05:49.708 "keyring_file_remove_key", 00:05:49.708 "keyring_file_add_key", 00:05:49.708 "vfu_virtio_create_scsi_endpoint", 00:05:49.708 "vfu_virtio_scsi_remove_target", 00:05:49.708 "vfu_virtio_scsi_add_target", 00:05:49.708 "vfu_virtio_create_blk_endpoint", 00:05:49.708 "vfu_virtio_delete_endpoint", 00:05:49.708 "iaa_scan_accel_module", 00:05:49.708 "dsa_scan_accel_module", 00:05:49.708 "ioat_scan_accel_module", 00:05:49.708 
"accel_error_inject_error", 00:05:49.708 "bdev_iscsi_delete", 00:05:49.708 "bdev_iscsi_create", 00:05:49.708 "bdev_iscsi_set_options", 00:05:49.708 "bdev_virtio_attach_controller", 00:05:49.708 "bdev_virtio_scsi_get_devices", 00:05:49.708 "bdev_virtio_detach_controller", 00:05:49.708 "bdev_virtio_blk_set_hotplug", 00:05:49.708 "bdev_ftl_set_property", 00:05:49.708 "bdev_ftl_get_properties", 00:05:49.708 "bdev_ftl_get_stats", 00:05:49.708 "bdev_ftl_unmap", 00:05:49.708 "bdev_ftl_unload", 00:05:49.708 "bdev_ftl_delete", 00:05:49.708 "bdev_ftl_load", 00:05:49.708 "bdev_ftl_create", 00:05:49.708 "bdev_aio_delete", 00:05:49.708 "bdev_aio_rescan", 00:05:49.708 "bdev_aio_create", 00:05:49.708 "blobfs_create", 00:05:49.708 "blobfs_detect", 00:05:49.708 "blobfs_set_cache_size", 00:05:49.708 "bdev_zone_block_delete", 00:05:49.708 "bdev_zone_block_create", 00:05:49.708 "bdev_delay_delete", 00:05:49.708 "bdev_delay_create", 00:05:49.708 "bdev_delay_update_latency", 00:05:49.708 "bdev_split_delete", 00:05:49.708 "bdev_split_create", 00:05:49.708 "bdev_error_inject_error", 00:05:49.708 "bdev_error_delete", 00:05:49.708 "bdev_error_create", 00:05:49.708 "bdev_raid_set_options", 00:05:49.708 "bdev_raid_remove_base_bdev", 00:05:49.708 "bdev_raid_add_base_bdev", 00:05:49.708 "bdev_raid_delete", 00:05:49.708 "bdev_raid_create", 00:05:49.708 "bdev_raid_get_bdevs", 00:05:49.708 "bdev_lvol_grow_lvstore", 00:05:49.708 "bdev_lvol_get_lvols", 00:05:49.708 "bdev_lvol_get_lvstores", 00:05:49.708 "bdev_lvol_delete", 00:05:49.708 "bdev_lvol_set_read_only", 00:05:49.708 "bdev_lvol_resize", 00:05:49.708 "bdev_lvol_decouple_parent", 00:05:49.708 "bdev_lvol_inflate", 00:05:49.708 "bdev_lvol_rename", 00:05:49.708 "bdev_lvol_clone_bdev", 00:05:49.708 "bdev_lvol_clone", 00:05:49.708 "bdev_lvol_snapshot", 00:05:49.708 "bdev_lvol_create", 00:05:49.708 "bdev_lvol_delete_lvstore", 00:05:49.708 "bdev_lvol_rename_lvstore", 00:05:49.708 "bdev_lvol_create_lvstore", 00:05:49.708 "bdev_passthru_delete", 00:05:49.708 "bdev_passthru_create", 00:05:49.708 "bdev_nvme_cuse_unregister", 00:05:49.708 "bdev_nvme_cuse_register", 00:05:49.708 "bdev_opal_new_user", 00:05:49.708 "bdev_opal_set_lock_state", 00:05:49.708 "bdev_opal_delete", 00:05:49.708 "bdev_opal_get_info", 00:05:49.708 "bdev_opal_create", 00:05:49.708 "bdev_nvme_opal_revert", 00:05:49.708 "bdev_nvme_opal_init", 00:05:49.708 "bdev_nvme_send_cmd", 00:05:49.708 "bdev_nvme_get_path_iostat", 00:05:49.708 "bdev_nvme_get_mdns_discovery_info", 00:05:49.708 "bdev_nvme_stop_mdns_discovery", 00:05:49.708 "bdev_nvme_start_mdns_discovery", 00:05:49.708 "bdev_nvme_set_multipath_policy", 00:05:49.708 "bdev_nvme_set_preferred_path", 00:05:49.708 "bdev_nvme_get_io_paths", 00:05:49.708 "bdev_nvme_remove_error_injection", 00:05:49.708 "bdev_nvme_add_error_injection", 00:05:49.708 "bdev_nvme_get_discovery_info", 00:05:49.708 "bdev_nvme_stop_discovery", 00:05:49.708 "bdev_nvme_start_discovery", 00:05:49.708 "bdev_nvme_get_controller_health_info", 00:05:49.708 "bdev_nvme_disable_controller", 00:05:49.708 "bdev_nvme_enable_controller", 00:05:49.708 "bdev_nvme_reset_controller", 00:05:49.708 "bdev_nvme_get_transport_statistics", 00:05:49.708 "bdev_nvme_apply_firmware", 00:05:49.708 "bdev_nvme_detach_controller", 00:05:49.708 "bdev_nvme_get_controllers", 00:05:49.708 "bdev_nvme_attach_controller", 00:05:49.708 "bdev_nvme_set_hotplug", 00:05:49.708 "bdev_nvme_set_options", 00:05:49.708 "bdev_null_resize", 00:05:49.708 "bdev_null_delete", 00:05:49.708 "bdev_null_create", 00:05:49.708 "bdev_malloc_delete", 
00:05:49.708 "bdev_malloc_create" 00:05:49.708 ] 00:05:49.708 19:12:36 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:49.708 19:12:36 -- common/autotest_common.sh@716 -- # xtrace_disable 00:05:49.708 19:12:36 -- common/autotest_common.sh@10 -- # set +x 00:05:49.708 19:12:36 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:49.708 19:12:36 -- spdkcli/tcp.sh@38 -- # killprocess 1605529 00:05:49.708 19:12:36 -- common/autotest_common.sh@936 -- # '[' -z 1605529 ']' 00:05:49.708 19:12:36 -- common/autotest_common.sh@940 -- # kill -0 1605529 00:05:49.708 19:12:36 -- common/autotest_common.sh@941 -- # uname 00:05:49.708 19:12:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:49.708 19:12:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1605529 00:05:49.708 19:12:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:49.708 19:12:36 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:49.708 19:12:36 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1605529' 00:05:49.708 killing process with pid 1605529 00:05:49.708 19:12:36 -- common/autotest_common.sh@955 -- # kill 1605529 00:05:49.708 19:12:36 -- common/autotest_common.sh@960 -- # wait 1605529 00:05:50.273 00:05:50.273 real 0m1.558s 00:05:50.273 user 0m2.825s 00:05:50.273 sys 0m0.495s 00:05:50.273 19:12:37 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:50.273 19:12:37 -- common/autotest_common.sh@10 -- # set +x 00:05:50.273 ************************************ 00:05:50.273 END TEST spdkcli_tcp 00:05:50.273 ************************************ 00:05:50.273 19:12:37 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:50.273 19:12:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:50.273 19:12:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.273 19:12:37 -- common/autotest_common.sh@10 -- # set +x 00:05:50.273 ************************************ 00:05:50.273 START TEST dpdk_mem_utility 00:05:50.273 ************************************ 00:05:50.273 19:12:37 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:50.534 * Looking for test storage... 00:05:50.534 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:50.534 19:12:37 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:50.534 19:12:37 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1605845 00:05:50.534 19:12:37 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1605845 00:05:50.534 19:12:37 -- common/autotest_common.sh@817 -- # '[' -z 1605845 ']' 00:05:50.534 19:12:37 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:50.534 19:12:37 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:50.534 19:12:37 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:50.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
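The spdkcli_tcp pass above never speaks to /var/tmp/spdk.sock directly: socat bridges TCP port 9998 to the UNIX socket so the Python RPC client can exercise the TCP transport, and the long list just printed is rpc_get_methods answered over that bridge. A condensed reproduction — the port, address, and rpc.py flags are copied from the trace; the cleanup is simplified:

    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
    socat_pid=$!
    # -r 100 retries / -t 2 s timeout give the client slack while socat comes up
    ./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
    kill "$socat_pid"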
00:05:50.534 19:12:37 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:50.534 19:12:37 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:50.534 19:12:37 -- common/autotest_common.sh@10 -- # set +x 00:05:50.534 [2024-04-24 19:12:37.321710] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:50.534 [2024-04-24 19:12:37.321792] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605845 ] 00:05:50.534 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.534 [2024-04-24 19:12:37.398664] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.534 [2024-04-24 19:12:37.494356] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.187 19:12:38 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:51.187 19:12:38 -- common/autotest_common.sh@850 -- # return 0 00:05:51.187 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:51.187 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:51.187 19:12:38 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:51.187 19:12:38 -- common/autotest_common.sh@10 -- # set +x 00:05:51.187 { 00:05:51.187 "filename": "/tmp/spdk_mem_dump.txt" 00:05:51.187 } 00:05:51.187 19:12:38 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:51.187 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:51.187 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:51.187 1 heaps totaling size 814.000000 MiB 00:05:51.187 size: 814.000000 MiB heap id: 0 00:05:51.187 end heaps---------- 00:05:51.187 8 mempools totaling size 598.116089 MiB 00:05:51.187 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:51.187 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:51.187 size: 84.521057 MiB name: bdev_io_1605845 00:05:51.187 size: 51.011292 MiB name: evtpool_1605845 00:05:51.187 size: 50.003479 MiB name: msgpool_1605845 00:05:51.187 size: 21.763794 MiB name: PDU_Pool 00:05:51.187 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:51.187 size: 0.026123 MiB name: Session_Pool 00:05:51.187 end mempools------- 00:05:51.187 6 memzones totaling size 4.142822 MiB 00:05:51.187 size: 1.000366 MiB name: RG_ring_0_1605845 00:05:51.187 size: 1.000366 MiB name: RG_ring_1_1605845 00:05:51.187 size: 1.000366 MiB name: RG_ring_4_1605845 00:05:51.187 size: 1.000366 MiB name: RG_ring_5_1605845 00:05:51.187 size: 0.125366 MiB name: RG_ring_2_1605845 00:05:51.187 size: 0.015991 MiB name: RG_ring_3_1605845 00:05:51.187 end memzones------- 00:05:51.187 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:51.446 heap id: 0 total size: 814.000000 MiB number of busy elements: 42 number of free elements: 15 00:05:51.446 list of free elements. 
size: 12.517212 MiB 00:05:51.446 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:51.446 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:51.446 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:51.446 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:51.446 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:51.446 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:51.446 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:51.446 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:51.446 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:51.446 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:51.446 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:51.446 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:51.446 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:51.446 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:51.446 element at address: 0x200003a00000 with size: 0.353394 MiB 00:05:51.446 list of standard malloc elements. size: 199.220215 MiB 00:05:51.446 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:51.446 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:51.446 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:51.446 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:51.447 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:51.447 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:51.447 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:51.447 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:51.447 element at address: 0x200003aff280 with size: 0.002136 MiB 00:05:51.447 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:51.447 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003a5a780 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003adaa40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003adac40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003adef00 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003aff1c0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000192efc40 with size: 0.000183 MiB 
00:05:51.447 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:05:51.447 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:51.447 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:51.447 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:51.447 list of memzone associated elements. size: 602.262573 MiB 00:05:51.447 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:51.447 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:51.447 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:51.447 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:51.447 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:51.447 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1605845_0 00:05:51.447 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:51.447 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1605845_0 00:05:51.447 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:51.447 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1605845_0 00:05:51.447 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:51.447 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:51.447 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:51.447 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:51.447 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:51.447 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1605845 00:05:51.447 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:51.447 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1605845 00:05:51.447 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:51.447 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1605845 00:05:51.447 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:51.447 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:51.447 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:51.447 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:51.447 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:51.447 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:51.447 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:51.447 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:51.447 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:51.447 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1605845 00:05:51.447 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:51.447 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1605845 00:05:51.447 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:51.447 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1605845 00:05:51.447 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:51.447 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1605845 00:05:51.447 element at 
address: 0x200003a5a840 with size: 0.500488 MiB 00:05:51.447 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1605845 00:05:51.447 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:51.447 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:51.447 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:51.447 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:51.447 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:51.447 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:51.447 element at address: 0x200003adefc0 with size: 0.125488 MiB 00:05:51.447 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1605845 00:05:51.447 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:51.447 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:51.447 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:51.447 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:51.447 element at address: 0x200003adad00 with size: 0.016113 MiB 00:05:51.447 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1605845 00:05:51.447 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:51.447 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:51.447 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:51.447 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1605845 00:05:51.447 element at address: 0x200003adab00 with size: 0.000305 MiB 00:05:51.447 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1605845 00:05:51.447 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:51.447 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:51.447 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:51.447 19:12:38 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1605845 00:05:51.447 19:12:38 -- common/autotest_common.sh@936 -- # '[' -z 1605845 ']' 00:05:51.447 19:12:38 -- common/autotest_common.sh@940 -- # kill -0 1605845 00:05:51.447 19:12:38 -- common/autotest_common.sh@941 -- # uname 00:05:51.447 19:12:38 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:51.447 19:12:38 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1605845 00:05:51.447 19:12:38 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:51.447 19:12:38 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:51.447 19:12:38 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1605845' 00:05:51.447 killing process with pid 1605845 00:05:51.447 19:12:38 -- common/autotest_common.sh@955 -- # kill 1605845 00:05:51.447 19:12:38 -- common/autotest_common.sh@960 -- # wait 1605845 00:05:51.707 00:05:51.707 real 0m1.442s 00:05:51.707 user 0m1.481s 00:05:51.707 sys 0m0.439s 00:05:51.707 19:12:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:51.707 19:12:38 -- common/autotest_common.sh@10 -- # set +x 00:05:51.707 ************************************ 00:05:51.707 END TEST dpdk_mem_utility 00:05:51.707 ************************************ 00:05:51.707 19:12:38 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:51.707 19:12:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.707 19:12:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 
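dpdk_mem_utility, which ended just above, is a two-step flow: an RPC tells the target to dump its DPDK allocator state to a file (/tmp/spdk_mem_dump.txt per the trace), then a helper script post-processes the dump offline. The same steps by hand, with paths as reported in the trace:

    # step 1: target writes heap/mempool/memzone state to /tmp/spdk_mem_dump.txt
    ./scripts/rpc.py env_dpdk_get_mem_stats
    # step 2: summarize the dump, then show heap 0 element by element
    ./scripts/dpdk_mem_info.py
    ./scripts/dpdk_mem_info.py -m 0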
00:05:51.707 19:12:38 -- common/autotest_common.sh@10 -- # set +x 00:05:51.966 ************************************ 00:05:51.966 START TEST event 00:05:51.966 ************************************ 00:05:51.966 19:12:38 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:51.966 * Looking for test storage... 00:05:51.966 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:51.966 19:12:38 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:51.966 19:12:38 -- bdev/nbd_common.sh@6 -- # set -e 00:05:51.966 19:12:38 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:51.966 19:12:38 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:51.966 19:12:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.966 19:12:38 -- common/autotest_common.sh@10 -- # set +x 00:05:52.225 ************************************ 00:05:52.225 START TEST event_perf 00:05:52.225 ************************************ 00:05:52.225 19:12:39 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:52.225 Running I/O for 1 seconds...[2024-04-24 19:12:39.113898] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:52.226 [2024-04-24 19:12:39.113975] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606198 ] 00:05:52.226 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.226 [2024-04-24 19:12:39.191836] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:52.484 [2024-04-24 19:12:39.278375] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.484 [2024-04-24 19:12:39.278463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:52.484 [2024-04-24 19:12:39.278536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:52.484 [2024-04-24 19:12:39.278538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.420 Running I/O for 1 seconds... 00:05:53.420 lcore 0: 193028 00:05:53.420 lcore 1: 193029 00:05:53.420 lcore 2: 193030 00:05:53.420 lcore 3: 193029 00:05:53.420 done. 
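The per-lcore counters just printed are the whole output of event_perf: one event loop per core in the mask, run for the requested duration, then a count of events each lcore processed. Invocations for reference — the first is the harness's own, the second an illustrative variant for eyeballing per-core scaling:

    ./test/event/event_perf/event_perf -m 0xF -t 1   # 4 reactors, 1 second
    ./test/event/event_perf/event_perf -m 0x3 -t 1   # same test on 2 cores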
00:05:53.420 00:05:53.420 real 0m1.261s 00:05:53.420 user 0m4.159s 00:05:53.420 sys 0m0.098s 00:05:53.420 19:12:40 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:53.420 19:12:40 -- common/autotest_common.sh@10 -- # set +x 00:05:53.420 ************************************ 00:05:53.420 END TEST event_perf 00:05:53.420 ************************************ 00:05:53.420 19:12:40 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:53.420 19:12:40 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:53.420 19:12:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:53.420 19:12:40 -- common/autotest_common.sh@10 -- # set +x 00:05:53.679 ************************************ 00:05:53.679 START TEST event_reactor 00:05:53.679 ************************************ 00:05:53.679 19:12:40 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:53.679 [2024-04-24 19:12:40.557806] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:05:53.679 [2024-04-24 19:12:40.557897] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606404 ] 00:05:53.679 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.679 [2024-04-24 19:12:40.636205] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.937 [2024-04-24 19:12:40.722748] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.874 test_start 00:05:54.874 oneshot 00:05:54.874 tick 100 00:05:54.874 tick 100 00:05:54.874 tick 250 00:05:54.874 tick 100 00:05:54.874 tick 100 00:05:54.874 tick 100 00:05:54.874 tick 250 00:05:54.874 tick 500 00:05:54.874 tick 100 00:05:54.874 tick 100 00:05:54.874 tick 250 00:05:54.874 tick 100 00:05:54.874 tick 100 00:05:54.874 test_end 00:05:54.874 00:05:54.874 real 0m1.256s 00:05:54.874 user 0m1.151s 00:05:54.874 sys 0m0.100s 00:05:54.874 19:12:41 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:54.874 19:12:41 -- common/autotest_common.sh@10 -- # set +x 00:05:54.874 ************************************ 00:05:54.874 END TEST event_reactor 00:05:54.874 ************************************ 00:05:54.874 19:12:41 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:54.874 19:12:41 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:54.874 19:12:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.874 19:12:41 -- common/autotest_common.sh@10 -- # set +x 00:05:55.132 ************************************ 00:05:55.132 START TEST event_reactor_perf 00:05:55.132 ************************************ 00:05:55.132 19:12:41 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:55.132 [2024-04-24 19:12:42.000453] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:05:55.132 [2024-04-24 19:12:42.000516] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606611 ] 00:05:55.132 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.132 [2024-04-24 19:12:42.075866] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.390 [2024-04-24 19:12:42.159724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.326 test_start 00:05:56.326 test_end 00:05:56.326 Performance: 920251 events per second 00:05:56.326 00:05:56.326 real 0m1.246s 00:05:56.326 user 0m1.146s 00:05:56.326 sys 0m0.095s 00:05:56.326 19:12:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:05:56.326 19:12:43 -- common/autotest_common.sh@10 -- # set +x 00:05:56.326 ************************************ 00:05:56.326 END TEST event_reactor_perf 00:05:56.326 ************************************ 00:05:56.326 19:12:43 -- event/event.sh@49 -- # uname -s 00:05:56.326 19:12:43 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:56.326 19:12:43 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:56.326 19:12:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:56.326 19:12:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.326 19:12:43 -- common/autotest_common.sh@10 -- # set +x 00:05:56.585 ************************************ 00:05:56.585 START TEST event_scheduler 00:05:56.585 ************************************ 00:05:56.585 19:12:43 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:56.585 * Looking for test storage... 00:05:56.585 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:56.585 19:12:43 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:56.585 19:12:43 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1606842 00:05:56.585 19:12:43 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:56.585 19:12:43 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.585 19:12:43 -- scheduler/scheduler.sh@37 -- # waitforlisten 1606842 00:05:56.585 19:12:43 -- common/autotest_common.sh@817 -- # '[' -z 1606842 ']' 00:05:56.585 19:12:43 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:56.585 19:12:43 -- common/autotest_common.sh@822 -- # local max_retries=100 00:05:56.585 19:12:43 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:56.585 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:56.585 19:12:43 -- common/autotest_common.sh@826 -- # xtrace_disable 00:05:56.585 19:12:43 -- common/autotest_common.sh@10 -- # set +x 00:05:56.585 [2024-04-24 19:12:43.543921] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
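The two single-reactor tests above complement each other: event/reactor logs its pollers firing (the oneshot and tick 100/250/500 lines appear to be timed pollers at those periods), while reactor_perf pushes events through one reactor as fast as it can and prints the rate — 920251 events per second here. Their launch lines from the trace:

    ./test/event/reactor/reactor -t 1              # oneshot + timed pollers
    ./test/event/reactor_perf/reactor_perf -t 1    # raw event throughput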
00:05:56.585 [2024-04-24 19:12:43.543995] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606842 ] 00:05:56.585 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.844 [2024-04-24 19:12:43.615486] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:56.844 [2024-04-24 19:12:43.698789] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:56.844 [2024-04-24 19:12:43.698865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:56.844 [2024-04-24 19:12:43.698940] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:56.844 [2024-04-24 19:12:43.698942] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:57.411 19:12:44 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:05:57.411 19:12:44 -- common/autotest_common.sh@850 -- # return 0 00:05:57.411 19:12:44 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:57.411 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.411 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.411 POWER: Env isn't set yet! 00:05:57.411 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:57.412 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:57.412 POWER: Cannot set governor of lcore 0 to userspace 00:05:57.412 POWER: Attempting to initialise PSTAT power management... 00:05:57.412 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:57.412 POWER: Initialized successfully for lcore 0 power management 00:05:57.412 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:57.412 POWER: Initialized successfully for lcore 1 power management 00:05:57.670 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:57.670 POWER: Initialized successfully for lcore 2 power management 00:05:57.670 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:57.670 POWER: Initialized successfully for lcore 3 power management 00:05:57.670 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.670 19:12:44 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:57.670 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.670 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.670 [2024-04-24 19:12:44.519501] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
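The POWER: lines above are DPDK's power library at work: for every core in the scheduler's 0xF mask it rewrites the cpufreq governor to 'performance' through sysfs, restoring the original on exit (logged further down). The equivalent manual operation, for illustration only — it needs root, and the core list simply matches the mask used here:

    for cpu in 0 1 2 3; do
        gov=/sys/devices/system/cpu/cpu$cpu/cpufreq/scaling_governor
        cat "$gov"                                   # e.g. powersave
        echo performance | sudo tee "$gov" >/dev/null
    done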
00:05:57.670 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.670 19:12:44 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:57.670 19:12:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:57.670 19:12:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.670 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.670 ************************************ 00:05:57.670 START TEST scheduler_create_thread 00:05:57.670 ************************************ 00:05:57.670 19:12:44 -- common/autotest_common.sh@1111 -- # scheduler_create_thread 00:05:57.670 19:12:44 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 2 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 3 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 4 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 5 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 6 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 7 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 8 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 9 00:05:57.929 
19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 10 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:57.929 19:12:44 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:57.929 19:12:44 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:57.929 19:12:44 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:57.929 19:12:44 -- common/autotest_common.sh@10 -- # set +x 00:05:58.863 19:12:45 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:05:58.863 19:12:45 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:58.863 19:12:45 -- common/autotest_common.sh@549 -- # xtrace_disable 00:05:58.863 19:12:45 -- common/autotest_common.sh@10 -- # set +x 00:06:00.239 19:12:47 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:00.239 19:12:47 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:00.239 19:12:47 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:00.239 19:12:47 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:00.239 19:12:47 -- common/autotest_common.sh@10 -- # set +x 00:06:01.173 19:12:48 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:01.173 00:06:01.173 real 0m3.483s 00:06:01.173 user 0m0.023s 00:06:01.173 sys 0m0.006s 00:06:01.173 19:12:48 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:01.173 19:12:48 -- common/autotest_common.sh@10 -- # set +x 00:06:01.173 ************************************ 00:06:01.173 END TEST scheduler_create_thread 00:06:01.173 ************************************ 00:06:01.431 19:12:48 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:01.431 19:12:48 -- scheduler/scheduler.sh@46 -- # killprocess 1606842 00:06:01.431 19:12:48 -- common/autotest_common.sh@936 -- # '[' -z 1606842 ']' 00:06:01.431 19:12:48 -- common/autotest_common.sh@940 -- # kill -0 1606842 00:06:01.431 19:12:48 -- common/autotest_common.sh@941 -- # uname 00:06:01.431 19:12:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:01.431 19:12:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1606842 00:06:01.431 19:12:48 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:01.431 19:12:48 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:01.431 19:12:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1606842' 00:06:01.431 killing process with pid 1606842 00:06:01.431 19:12:48 -- common/autotest_common.sh@955 -- # kill 1606842 00:06:01.431 19:12:48 -- common/autotest_common.sh@960 -- # wait 1606842 00:06:01.690 [2024-04-24 19:12:48.648779] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
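scheduler_create_thread, which just finished, drives a test-local rpc.py plugin: scheduler_thread_create spawns SPDK threads with a name, cpumask, and active percentage, and the returned thread id feeds the later set_active/delete calls. The calls as the trace issued them — this assumes the plugin's directory is on PYTHONPATH, which the test script arranges, and ids 11/12 are the ones echoed above:

    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create \
        -n active_pinned -m 0x1 -a 100
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50
    ./scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12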
00:06:01.690 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:01.690 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:01.690 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:01.690 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:01.690 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:01.690 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:01.690 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:01.690 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:01.948 00:06:01.948 real 0m5.435s 00:06:01.948 user 0m8.902s 00:06:01.948 sys 0m0.550s 00:06:01.948 19:12:48 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:01.948 19:12:48 -- common/autotest_common.sh@10 -- # set +x 00:06:01.948 ************************************ 00:06:01.948 END TEST event_scheduler 00:06:01.948 ************************************ 00:06:01.948 19:12:48 -- event/event.sh@51 -- # modprobe -n nbd 00:06:01.948 19:12:48 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:01.948 19:12:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:01.948 19:12:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:01.948 19:12:48 -- common/autotest_common.sh@10 -- # set +x 00:06:02.207 ************************************ 00:06:02.207 START TEST app_repeat 00:06:02.207 ************************************ 00:06:02.207 19:12:49 -- common/autotest_common.sh@1111 -- # app_repeat_test 00:06:02.207 19:12:49 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.207 19:12:49 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.207 19:12:49 -- event/event.sh@13 -- # local nbd_list 00:06:02.207 19:12:49 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.207 19:12:49 -- event/event.sh@14 -- # local bdev_list 00:06:02.207 19:12:49 -- event/event.sh@15 -- # local repeat_times=4 00:06:02.207 19:12:49 -- event/event.sh@17 -- # modprobe nbd 00:06:02.207 19:12:49 -- event/event.sh@19 -- # repeat_pid=1607612 00:06:02.207 19:12:49 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:02.207 19:12:49 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1607612' 00:06:02.207 Process app_repeat pid: 1607612 00:06:02.207 19:12:49 -- event/event.sh@23 -- # for i in {0..2} 00:06:02.207 19:12:49 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:02.207 spdk_app_start Round 0 00:06:02.207 19:12:49 -- event/event.sh@25 -- # waitforlisten 1607612 /var/tmp/spdk-nbd.sock 00:06:02.207 19:12:49 -- common/autotest_common.sh@817 -- # '[' -z 1607612 ']' 00:06:02.207 19:12:49 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:02.207 19:12:49 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:02.207 19:12:49 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:02.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
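app_repeat, starting above, is a start/stop soak: the app comes up on /var/tmp/spdk-nbd.sock, the harness builds two malloc bdevs and exports them over NBD each round, verifies I/O, then restarts the app ('spdk_app_start Round 0' marks the first round). Its launch line from the trace:

    ./test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4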
00:06:02.207 19:12:49 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:02.207 19:12:49 -- common/autotest_common.sh@10 -- # set +x 00:06:02.207 19:12:49 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:02.207 [2024-04-24 19:12:49.051116] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:02.207 [2024-04-24 19:12:49.051196] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607612 ] 00:06:02.207 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.207 [2024-04-24 19:12:49.130561] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:02.466 [2024-04-24 19:12:49.223536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:02.466 [2024-04-24 19:12:49.223539] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.032 19:12:49 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:03.032 19:12:49 -- common/autotest_common.sh@850 -- # return 0 00:06:03.032 19:12:49 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.032 Malloc0 00:06:03.290 19:12:50 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:03.290 Malloc1 00:06:03.290 19:12:50 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:03.290 19:12:50 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@12 -- # local i 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.291 19:12:50 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:03.549 /dev/nbd0 00:06:03.549 19:12:50 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:03.549 19:12:50 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:03.549 19:12:50 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:03.549 19:12:50 -- common/autotest_common.sh@855 -- # local i 00:06:03.549 19:12:50 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:03.549 19:12:50 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:03.549 19:12:50 -- common/autotest_common.sh@858 -- # grep -q 
-w nbd0 /proc/partitions 00:06:03.549 19:12:50 -- common/autotest_common.sh@859 -- # break 00:06:03.549 19:12:50 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:03.549 19:12:50 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:03.549 19:12:50 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.549 1+0 records in 00:06:03.549 1+0 records out 00:06:03.549 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269944 s, 15.2 MB/s 00:06:03.549 19:12:50 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.549 19:12:50 -- common/autotest_common.sh@872 -- # size=4096 00:06:03.549 19:12:50 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.549 19:12:50 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:03.549 19:12:50 -- common/autotest_common.sh@875 -- # return 0 00:06:03.549 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.549 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.549 19:12:50 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:03.808 /dev/nbd1 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:03.808 19:12:50 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:03.808 19:12:50 -- common/autotest_common.sh@855 -- # local i 00:06:03.808 19:12:50 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:03.808 19:12:50 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:03.808 19:12:50 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:03.808 19:12:50 -- common/autotest_common.sh@859 -- # break 00:06:03.808 19:12:50 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:03.808 19:12:50 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:03.808 19:12:50 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:03.808 1+0 records in 00:06:03.808 1+0 records out 00:06:03.808 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00018719 s, 21.9 MB/s 00:06:03.808 19:12:50 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.808 19:12:50 -- common/autotest_common.sh@872 -- # size=4096 00:06:03.808 19:12:50 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:03.808 19:12:50 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:03.808 19:12:50 -- common/autotest_common.sh@875 -- # return 0 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.808 19:12:50 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.065 19:12:50 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:04.065 { 00:06:04.065 "nbd_device": "/dev/nbd0", 00:06:04.065 "bdev_name": "Malloc0" 
00:06:04.065 }, 00:06:04.065 { 00:06:04.065 "nbd_device": "/dev/nbd1", 00:06:04.065 "bdev_name": "Malloc1" 00:06:04.065 } 00:06:04.065 ]' 00:06:04.065 19:12:50 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:04.065 { 00:06:04.065 "nbd_device": "/dev/nbd0", 00:06:04.065 "bdev_name": "Malloc0" 00:06:04.065 }, 00:06:04.065 { 00:06:04.065 "nbd_device": "/dev/nbd1", 00:06:04.065 "bdev_name": "Malloc1" 00:06:04.065 } 00:06:04.065 ]' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:04.066 /dev/nbd1' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:04.066 /dev/nbd1' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@65 -- # count=2 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@95 -- # count=2 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:04.066 256+0 records in 00:06:04.066 256+0 records out 00:06:04.066 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107826 s, 97.2 MB/s 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:04.066 256+0 records in 00:06:04.066 256+0 records out 00:06:04.066 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213381 s, 49.1 MB/s 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:04.066 256+0 records in 00:06:04.066 256+0 records out 00:06:04.066 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0224137 s, 46.8 MB/s 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:06:04.066 19:12:50 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@51 -- # local i 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.066 19:12:51 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@41 -- # break 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:04.324 19:12:51 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:04.581 19:12:51 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:04.581 19:12:51 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:04.581 19:12:51 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:04.581 19:12:51 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:04.581 19:12:51 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@41 -- # break 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@45 -- # return 0 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:04.582 19:12:51 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@65 -- # true 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@65 -- # count=0 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@104 -- # count=0 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:04.889 19:12:51 -- bdev/nbd_common.sh@109 -- # return 0 00:06:04.889 19:12:51 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:04.889 19:12:51 -- event/event.sh@35 -- # sleep 3 00:06:05.147 [2024-04-24 19:12:52.025636] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:05.147 [2024-04-24 19:12:52.104879] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:05.147 [2024-04-24 19:12:52.104881] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.147 [2024-04-24 19:12:52.149068] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:05.147 [2024-04-24 19:12:52.149115] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:08.430 19:12:54 -- event/event.sh@23 -- # for i in {0..2} 00:06:08.430 19:12:54 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:08.430 spdk_app_start Round 1 00:06:08.430 19:12:54 -- event/event.sh@25 -- # waitforlisten 1607612 /var/tmp/spdk-nbd.sock 00:06:08.430 19:12:54 -- common/autotest_common.sh@817 -- # '[' -z 1607612 ']' 00:06:08.430 19:12:54 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:08.430 19:12:54 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:08.430 19:12:54 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:08.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:08.430 19:12:54 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:08.430 19:12:54 -- common/autotest_common.sh@10 -- # set +x 00:06:08.430 19:12:55 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:08.430 19:12:55 -- common/autotest_common.sh@850 -- # return 0 00:06:08.430 19:12:55 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:08.430 Malloc0 00:06:08.430 19:12:55 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:08.430 Malloc1 00:06:08.430 19:12:55 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@12 -- # local i 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:08.430 19:12:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:08.688 /dev/nbd0 00:06:08.688 19:12:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:08.688 19:12:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:08.688 19:12:55 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:08.688 19:12:55 -- common/autotest_common.sh@855 -- # local i 00:06:08.688 19:12:55 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:08.688 19:12:55 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:08.688 19:12:55 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:08.688 19:12:55 -- common/autotest_common.sh@859 -- # break 00:06:08.688 19:12:55 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:08.688 19:12:55 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:08.688 19:12:55 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:08.688 1+0 records in 00:06:08.688 1+0 records out 00:06:08.688 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250913 s, 16.3 MB/s 00:06:08.688 19:12:55 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.688 19:12:55 -- common/autotest_common.sh@872 -- # size=4096 00:06:08.688 19:12:55 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.688 19:12:55 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:08.688 19:12:55 -- common/autotest_common.sh@875 -- # return 0 00:06:08.688 19:12:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:08.688 19:12:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:08.688 19:12:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:08.948 /dev/nbd1 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:08.948 19:12:55 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:08.948 19:12:55 -- common/autotest_common.sh@855 -- # local i 00:06:08.948 19:12:55 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:08.948 19:12:55 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:08.948 19:12:55 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:08.948 19:12:55 -- common/autotest_common.sh@859 -- # break 00:06:08.948 19:12:55 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:08.948 19:12:55 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:08.948 19:12:55 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:08.948 1+0 records in 00:06:08.948 1+0 records out 00:06:08.948 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226718 s, 18.1 MB/s 00:06:08.948 19:12:55 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.948 19:12:55 -- common/autotest_common.sh@872 -- # size=4096 00:06:08.948 19:12:55 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.948 19:12:55 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:08.948 19:12:55 -- common/autotest_common.sh@875 -- # return 0 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@14 -- # (( 
i++ )) 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:08.948 { 00:06:08.948 "nbd_device": "/dev/nbd0", 00:06:08.948 "bdev_name": "Malloc0" 00:06:08.948 }, 00:06:08.948 { 00:06:08.948 "nbd_device": "/dev/nbd1", 00:06:08.948 "bdev_name": "Malloc1" 00:06:08.948 } 00:06:08.948 ]' 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:08.948 { 00:06:08.948 "nbd_device": "/dev/nbd0", 00:06:08.948 "bdev_name": "Malloc0" 00:06:08.948 }, 00:06:08.948 { 00:06:08.948 "nbd_device": "/dev/nbd1", 00:06:08.948 "bdev_name": "Malloc1" 00:06:08.948 } 00:06:08.948 ]' 00:06:08.948 19:12:55 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:09.207 /dev/nbd1' 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:09.207 /dev/nbd1' 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@65 -- # count=2 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@95 -- # count=2 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:09.207 19:12:55 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:09.207 256+0 records in 00:06:09.207 256+0 records out 00:06:09.207 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010399 s, 101 MB/s 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:09.207 256+0 records in 00:06:09.207 256+0 records out 00:06:09.207 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209688 s, 50.0 MB/s 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:09.207 256+0 records in 00:06:09.207 256+0 records out 00:06:09.207 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0225328 s, 46.5 MB/s 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@71 -- # local operation=verify 
00:06:09.207 19:12:56 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@51 -- # local i 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:09.207 19:12:56 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@41 -- # break 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@45 -- # return 0 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:09.466 19:12:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@41 -- # break 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@45 -- # return 0 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:09.724 19:12:56 -- 
bdev/nbd_common.sh@65 -- # echo '' 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@65 -- # true 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@65 -- # count=0 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@104 -- # count=0 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:09.724 19:12:56 -- bdev/nbd_common.sh@109 -- # return 0 00:06:09.724 19:12:56 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:09.982 19:12:56 -- event/event.sh@35 -- # sleep 3 00:06:10.240 [2024-04-24 19:12:57.099660] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:10.240 [2024-04-24 19:12:57.179962] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:10.240 [2024-04-24 19:12:57.179965] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.240 [2024-04-24 19:12:57.227447] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:10.240 [2024-04-24 19:12:57.227493] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:13.524 19:12:59 -- event/event.sh@23 -- # for i in {0..2} 00:06:13.524 19:12:59 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:13.524 spdk_app_start Round 2 00:06:13.524 19:12:59 -- event/event.sh@25 -- # waitforlisten 1607612 /var/tmp/spdk-nbd.sock 00:06:13.524 19:12:59 -- common/autotest_common.sh@817 -- # '[' -z 1607612 ']' 00:06:13.524 19:12:59 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:13.524 19:12:59 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:13.524 19:12:59 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:13.524 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
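The nbd_get_count steps traced above (bdev/nbd_common.sh@61-66) reduce the nbd_get_disks RPC output to a plain device count: jq extracts each .nbd_device path, grep -c counts the /dev/nbd entries, and a trailing true keeps the pipeline alive when nothing is attached. A minimal sketch of that idiom, assuming $rootdir points at the SPDK checkout; the in-tree helper may differ in detail:

    nbd_get_count() {
        local rpc_server=$1            # e.g. /var/tmp/spdk-nbd.sock
        local disks_json disks_name count
        disks_json=$("$rootdir/scripts/rpc.py" -s "$rpc_server" nbd_get_disks)
        disks_name=$(echo "$disks_json" | jq -r '.[] | .nbd_device')
        # grep -c prints 0 but exits nonzero on no match, hence the || true guard
        count=$(echo "$disks_name" | grep -c /dev/nbd || true)
        echo "$count"
    }

The test then compares this count against the number of disks it started (the '[' 2 -ne 2 ']' check above) before writing data, and against 0 again after stopping them.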
00:06:13.524 19:12:59 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:13.524 19:12:59 -- common/autotest_common.sh@10 -- # set +x 00:06:13.524 19:13:00 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:13.524 19:13:00 -- common/autotest_common.sh@850 -- # return 0 00:06:13.524 19:13:00 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:13.524 Malloc0 00:06:13.524 19:13:00 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:13.524 Malloc1 00:06:13.524 19:13:00 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@12 -- # local i 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:13.524 19:13:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:13.784 /dev/nbd0 00:06:13.784 19:13:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:13.784 19:13:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:13.784 19:13:00 -- common/autotest_common.sh@854 -- # local nbd_name=nbd0 00:06:13.784 19:13:00 -- common/autotest_common.sh@855 -- # local i 00:06:13.784 19:13:00 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:13.784 19:13:00 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:13.784 19:13:00 -- common/autotest_common.sh@858 -- # grep -q -w nbd0 /proc/partitions 00:06:13.784 19:13:00 -- common/autotest_common.sh@859 -- # break 00:06:13.784 19:13:00 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:13.784 19:13:00 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:13.784 19:13:00 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:13.784 1+0 records in 00:06:13.784 1+0 records out 00:06:13.784 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232681 s, 17.6 MB/s 00:06:13.784 19:13:00 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:13.784 19:13:00 -- common/autotest_common.sh@872 -- # size=4096 00:06:13.784 19:13:00 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:13.784 19:13:00 -- 
common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:13.784 19:13:00 -- common/autotest_common.sh@875 -- # return 0 00:06:13.784 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:13.784 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:13.784 19:13:00 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:14.044 /dev/nbd1 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:14.044 19:13:00 -- common/autotest_common.sh@854 -- # local nbd_name=nbd1 00:06:14.044 19:13:00 -- common/autotest_common.sh@855 -- # local i 00:06:14.044 19:13:00 -- common/autotest_common.sh@857 -- # (( i = 1 )) 00:06:14.044 19:13:00 -- common/autotest_common.sh@857 -- # (( i <= 20 )) 00:06:14.044 19:13:00 -- common/autotest_common.sh@858 -- # grep -q -w nbd1 /proc/partitions 00:06:14.044 19:13:00 -- common/autotest_common.sh@859 -- # break 00:06:14.044 19:13:00 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:14.044 19:13:00 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:14.044 19:13:00 -- common/autotest_common.sh@871 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:14.044 1+0 records in 00:06:14.044 1+0 records out 00:06:14.044 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188741 s, 21.7 MB/s 00:06:14.044 19:13:00 -- common/autotest_common.sh@872 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:14.044 19:13:00 -- common/autotest_common.sh@872 -- # size=4096 00:06:14.044 19:13:00 -- common/autotest_common.sh@873 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:14.044 19:13:00 -- common/autotest_common.sh@874 -- # '[' 4096 '!=' 0 ']' 00:06:14.044 19:13:00 -- common/autotest_common.sh@875 -- # return 0 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.044 19:13:00 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:14.044 19:13:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:14.044 { 00:06:14.044 "nbd_device": "/dev/nbd0", 00:06:14.044 "bdev_name": "Malloc0" 00:06:14.044 }, 00:06:14.044 { 00:06:14.044 "nbd_device": "/dev/nbd1", 00:06:14.044 "bdev_name": "Malloc1" 00:06:14.044 } 00:06:14.044 ]' 00:06:14.044 19:13:01 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:14.044 { 00:06:14.044 "nbd_device": "/dev/nbd0", 00:06:14.044 "bdev_name": "Malloc0" 00:06:14.044 }, 00:06:14.044 { 00:06:14.044 "nbd_device": "/dev/nbd1", 00:06:14.044 "bdev_name": "Malloc1" 00:06:14.044 } 00:06:14.044 ]' 00:06:14.044 19:13:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:14.304 /dev/nbd1' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:14.304 /dev/nbd1' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@65 -- # count=2 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:14.304 19:13:01 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:14.304 256+0 records in 00:06:14.304 256+0 records out 00:06:14.304 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113721 s, 92.2 MB/s 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:14.304 256+0 records in 00:06:14.304 256+0 records out 00:06:14.304 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211318 s, 49.6 MB/s 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:14.304 256+0 records in 00:06:14.304 256+0 records out 00:06:14.304 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0224541 s, 46.7 MB/s 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@51 -- # local i 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:14.304 19:13:01 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@41 -- # break 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@45 -- # return 0 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:14.564 19:13:01 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@41 -- # break 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@45 -- # return 0 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@65 -- # true 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@65 -- # count=0 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@104 -- # count=0 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:14.824 19:13:01 -- bdev/nbd_common.sh@109 -- # return 0 00:06:14.824 19:13:01 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:15.082 19:13:01 -- event/event.sh@35 -- # sleep 3 00:06:15.341 [2024-04-24 19:13:02.188896] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:15.341 [2024-04-24 19:13:02.269543] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.341 [2024-04-24 19:13:02.269547] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.341 [2024-04-24 19:13:02.315971] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:15.341 [2024-04-24 19:13:02.316019] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
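The teardown traced above stops each NBD device over RPC and then polls /proc/partitions until the kernel has actually released it (bdev/nbd_common.sh@35-45 and @49-55). A hedged reconstruction of that loop follows; the poll interval is an assumption, while the 20-attempt bound and the grep test come straight from the trace:

    waitfornbd_exit() {
        local nbd_name=$1 i
        for ((i = 1; i <= 20; i++)); do
            if grep -q -w "$nbd_name" /proc/partitions; then
                sleep 0.1              # device still registered, poll again (interval assumed)
            else
                break                  # kernel dropped the device
            fi
        done
        return 0
    }

    nbd_stop_disks() {
        local rpc_server=$1; shift
        local nbd
        for nbd in "$@"; do
            "$rootdir/scripts/rpc.py" -s "$rpc_server" nbd_stop_disk "$nbd"
            waitfornbd_exit "$(basename "$nbd")"
        done
    }

The same grep -q -w probe appears on the attach side (waitfornbd above), just with the condition inverted: there the loop breaks once the device shows up in /proc/partitions and a single direct-I/O dd read confirms it is usable.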
00:06:18.626 19:13:04 -- event/event.sh@38 -- # waitforlisten 1607612 /var/tmp/spdk-nbd.sock 00:06:18.626 19:13:04 -- common/autotest_common.sh@817 -- # '[' -z 1607612 ']' 00:06:18.626 19:13:04 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:18.626 19:13:04 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:18.626 19:13:04 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:18.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:18.626 19:13:04 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:18.626 19:13:04 -- common/autotest_common.sh@10 -- # set +x 00:06:18.626 19:13:05 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:18.626 19:13:05 -- common/autotest_common.sh@850 -- # return 0 00:06:18.626 19:13:05 -- event/event.sh@39 -- # killprocess 1607612 00:06:18.626 19:13:05 -- common/autotest_common.sh@936 -- # '[' -z 1607612 ']' 00:06:18.626 19:13:05 -- common/autotest_common.sh@940 -- # kill -0 1607612 00:06:18.626 19:13:05 -- common/autotest_common.sh@941 -- # uname 00:06:18.626 19:13:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.626 19:13:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1607612 00:06:18.626 19:13:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.626 19:13:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.626 19:13:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1607612' 00:06:18.626 killing process with pid 1607612 00:06:18.626 19:13:05 -- common/autotest_common.sh@955 -- # kill 1607612 00:06:18.626 19:13:05 -- common/autotest_common.sh@960 -- # wait 1607612 00:06:18.626 spdk_app_start is called in Round 0. 00:06:18.626 Shutdown signal received, stop current app iteration 00:06:18.626 Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 reinitialization... 00:06:18.626 spdk_app_start is called in Round 1. 00:06:18.626 Shutdown signal received, stop current app iteration 00:06:18.626 Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 reinitialization... 00:06:18.626 spdk_app_start is called in Round 2. 00:06:18.626 Shutdown signal received, stop current app iteration 00:06:18.626 Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 reinitialization... 00:06:18.626 spdk_app_start is called in Round 3. 
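Rounds 0 through 3 above all follow the same driver shape from event.sh: Round 0 is the initial app_repeat launch, and each loop pass waits for the relaunched app, creates two malloc bdevs, verifies them through NBD, then asks the app to SIGTERM itself so the next round can begin. A condensed sketch of that loop, assuming $repeat_pid holds the app_repeat pid (1607612 in this run); ordering is taken from the trace, variable names are illustrative:

    for i in {0..2}; do
        echo "spdk_app_start Round $((i + 1))"
        waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock
        # Malloc0 and Malloc1: 64 MiB each, 4 KiB block size
        "$rootdir/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
        "$rootdir/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
        nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
        # app_repeat restarts spdk_app_start on SIGTERM, producing the next round
        "$rootdir/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
        sleep 3
    done
    waitforlisten "$repeat_pid" /var/tmp/spdk-nbd.sock    # final round came up
    killprocess "$repeat_pid"                             # real shutdown this time

This matches the app-side messages in the log: each SIGTERM yields "Shutdown signal received, stop current app iteration" followed by a reinitialization banner for the next round, and only the final killprocess ends the process for good.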
00:06:18.626 Shutdown signal received, stop current app iteration 00:06:18.626 19:13:05 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:18.626 19:13:05 -- event/event.sh@42 -- # return 0 00:06:18.626 00:06:18.626 real 0m16.351s 00:06:18.626 user 0m34.410s 00:06:18.626 sys 0m3.299s 00:06:18.626 19:13:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:18.626 19:13:05 -- common/autotest_common.sh@10 -- # set +x 00:06:18.626 ************************************ 00:06:18.626 END TEST app_repeat 00:06:18.626 ************************************ 00:06:18.626 19:13:05 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:18.626 19:13:05 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:18.626 19:13:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.626 19:13:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.626 19:13:05 -- common/autotest_common.sh@10 -- # set +x 00:06:18.626 ************************************ 00:06:18.626 START TEST cpu_locks 00:06:18.626 ************************************ 00:06:18.626 19:13:05 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:18.626 * Looking for test storage... 00:06:18.885 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:18.885 19:13:05 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:18.885 19:13:05 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:18.885 19:13:05 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:18.885 19:13:05 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:18.885 19:13:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.885 19:13:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.885 19:13:05 -- common/autotest_common.sh@10 -- # set +x 00:06:18.885 ************************************ 00:06:18.885 START TEST default_locks 00:06:18.885 ************************************ 00:06:18.885 19:13:05 -- common/autotest_common.sh@1111 -- # default_locks 00:06:18.885 19:13:05 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1610123 00:06:18.885 19:13:05 -- event/cpu_locks.sh@47 -- # waitforlisten 1610123 00:06:18.885 19:13:05 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:18.885 19:13:05 -- common/autotest_common.sh@817 -- # '[' -z 1610123 ']' 00:06:18.885 19:13:05 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:18.885 19:13:05 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:18.885 19:13:05 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:18.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:18.885 19:13:05 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:18.885 19:13:05 -- common/autotest_common.sh@10 -- # set +x 00:06:18.885 [2024-04-24 19:13:05.827000] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:18.885 [2024-04-24 19:13:05.827071] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610123 ] 00:06:18.885 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.143 [2024-04-24 19:13:05.904896] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.143 [2024-04-24 19:13:05.986485] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.709 19:13:06 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:19.709 19:13:06 -- common/autotest_common.sh@850 -- # return 0 00:06:19.709 19:13:06 -- event/cpu_locks.sh@49 -- # locks_exist 1610123 00:06:19.709 19:13:06 -- event/cpu_locks.sh@22 -- # lslocks -p 1610123 00:06:19.709 19:13:06 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:20.279 lslocks: write error 00:06:20.279 19:13:07 -- event/cpu_locks.sh@50 -- # killprocess 1610123 00:06:20.279 19:13:07 -- common/autotest_common.sh@936 -- # '[' -z 1610123 ']' 00:06:20.279 19:13:07 -- common/autotest_common.sh@940 -- # kill -0 1610123 00:06:20.279 19:13:07 -- common/autotest_common.sh@941 -- # uname 00:06:20.279 19:13:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:20.279 19:13:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1610123 00:06:20.539 19:13:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:20.539 19:13:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:20.539 19:13:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1610123' 00:06:20.539 killing process with pid 1610123 00:06:20.539 19:13:07 -- common/autotest_common.sh@955 -- # kill 1610123 00:06:20.539 19:13:07 -- common/autotest_common.sh@960 -- # wait 1610123 00:06:20.799 19:13:07 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1610123 00:06:20.799 19:13:07 -- common/autotest_common.sh@638 -- # local es=0 00:06:20.799 19:13:07 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 1610123 00:06:20.799 19:13:07 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:20.799 19:13:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:20.799 19:13:07 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:20.799 19:13:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:20.799 19:13:07 -- common/autotest_common.sh@641 -- # waitforlisten 1610123 00:06:20.799 19:13:07 -- common/autotest_common.sh@817 -- # '[' -z 1610123 ']' 00:06:20.799 19:13:07 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:20.799 19:13:07 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:20.799 19:13:07 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:20.799 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
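The default_locks test above asserts that a freshly started spdk_tgt actually holds its CPU-core file lock, using lslocks (event/cpu_locks.sh@22), and only then kills the target and re-runs waitforlisten under the NOT wrapper, expecting it to fail. A hedged sketch of the lock assertion; the lock-file name is inferred from the grep pattern in the trace:

    locks_exist() {
        local pid=$1
        # lslocks lists the locks held by the pid; grep -q exits at the
        # first spdk_cpu_lock hit, so lslocks can report "write error"
        # (a broken pipe) exactly as seen in the log - the message is harmless
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }

The NOT wrapper that follows simply inverts the exit status of its command, so the test passes precisely because waitforlisten on the killed pid errors out, which is the "No such process" / "ERROR: process (pid: 1610123) is no longer running" output below.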
00:06:20.799 19:13:07 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:20.799 19:13:07 -- common/autotest_common.sh@10 -- # set +x 00:06:20.799 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (1610123) - No such process 00:06:20.799 ERROR: process (pid: 1610123) is no longer running 00:06:20.799 19:13:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:20.799 19:13:07 -- common/autotest_common.sh@850 -- # return 1 00:06:20.799 19:13:07 -- common/autotest_common.sh@641 -- # es=1 00:06:20.799 19:13:07 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:20.799 19:13:07 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:20.799 19:13:07 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:20.799 19:13:07 -- event/cpu_locks.sh@54 -- # no_locks 00:06:20.799 19:13:07 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:20.799 19:13:07 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:20.799 19:13:07 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:20.799 00:06:20.799 real 0m1.849s 00:06:20.799 user 0m1.917s 00:06:20.799 sys 0m0.676s 00:06:20.799 19:13:07 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:20.799 19:13:07 -- common/autotest_common.sh@10 -- # set +x 00:06:20.799 ************************************ 00:06:20.799 END TEST default_locks 00:06:20.799 ************************************ 00:06:20.799 19:13:07 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:20.799 19:13:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:20.799 19:13:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:20.799 19:13:07 -- common/autotest_common.sh@10 -- # set +x 00:06:21.058 ************************************ 00:06:21.058 START TEST default_locks_via_rpc 00:06:21.058 ************************************ 00:06:21.058 19:13:07 -- common/autotest_common.sh@1111 -- # default_locks_via_rpc 00:06:21.058 19:13:07 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1610350 00:06:21.058 19:13:07 -- event/cpu_locks.sh@63 -- # waitforlisten 1610350 00:06:21.058 19:13:07 -- common/autotest_common.sh@817 -- # '[' -z 1610350 ']' 00:06:21.058 19:13:07 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:21.058 19:13:07 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:21.058 19:13:07 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:21.058 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:21.058 19:13:07 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:21.058 19:13:07 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:21.058 19:13:07 -- common/autotest_common.sh@10 -- # set +x 00:06:21.058 [2024-04-24 19:13:07.861540] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:21.058 [2024-04-24 19:13:07.861618] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610350 ] 00:06:21.058 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.058 [2024-04-24 19:13:07.936279] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.058 [2024-04-24 19:13:08.028822] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.082 19:13:08 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:22.082 19:13:08 -- common/autotest_common.sh@850 -- # return 0 00:06:22.082 19:13:08 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:22.082 19:13:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:22.082 19:13:08 -- common/autotest_common.sh@10 -- # set +x 00:06:22.082 19:13:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:22.082 19:13:08 -- event/cpu_locks.sh@67 -- # no_locks 00:06:22.082 19:13:08 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:22.082 19:13:08 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:22.082 19:13:08 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:22.082 19:13:08 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:22.082 19:13:08 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:22.082 19:13:08 -- common/autotest_common.sh@10 -- # set +x 00:06:22.082 19:13:08 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:22.082 19:13:08 -- event/cpu_locks.sh@71 -- # locks_exist 1610350 00:06:22.082 19:13:08 -- event/cpu_locks.sh@22 -- # lslocks -p 1610350 00:06:22.082 19:13:08 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:22.082 19:13:09 -- event/cpu_locks.sh@73 -- # killprocess 1610350 00:06:22.082 19:13:09 -- common/autotest_common.sh@936 -- # '[' -z 1610350 ']' 00:06:22.082 19:13:09 -- common/autotest_common.sh@940 -- # kill -0 1610350 00:06:22.082 19:13:09 -- common/autotest_common.sh@941 -- # uname 00:06:22.082 19:13:09 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:22.082 19:13:09 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1610350 00:06:22.378 19:13:09 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:22.378 19:13:09 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:22.378 19:13:09 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1610350' 00:06:22.378 killing process with pid 1610350 00:06:22.378 19:13:09 -- common/autotest_common.sh@955 -- # kill 1610350 00:06:22.378 19:13:09 -- common/autotest_common.sh@960 -- # wait 1610350 00:06:22.637 00:06:22.637 real 0m1.565s 00:06:22.637 user 0m1.612s 00:06:22.637 sys 0m0.545s 00:06:22.637 19:13:09 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:22.637 19:13:09 -- common/autotest_common.sh@10 -- # set +x 00:06:22.637 ************************************ 00:06:22.637 END TEST default_locks_via_rpc 00:06:22.637 ************************************ 00:06:22.638 19:13:09 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:22.638 19:13:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:22.638 19:13:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:22.638 19:13:09 -- common/autotest_common.sh@10 -- # set +x 00:06:22.638 ************************************ 00:06:22.638 START TEST non_locking_app_on_locked_coremask 
00:06:22.638 ************************************ 00:06:22.638 19:13:09 -- common/autotest_common.sh@1111 -- # non_locking_app_on_locked_coremask 00:06:22.638 19:13:09 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1610706 00:06:22.638 19:13:09 -- event/cpu_locks.sh@81 -- # waitforlisten 1610706 /var/tmp/spdk.sock 00:06:22.638 19:13:09 -- common/autotest_common.sh@817 -- # '[' -z 1610706 ']' 00:06:22.638 19:13:09 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:22.638 19:13:09 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:22.638 19:13:09 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:22.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:22.638 19:13:09 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:22.638 19:13:09 -- common/autotest_common.sh@10 -- # set +x 00:06:22.638 19:13:09 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:22.638 [2024-04-24 19:13:09.613549] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:22.638 [2024-04-24 19:13:09.613619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610706 ] 00:06:22.638 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.897 [2024-04-24 19:13:09.690421] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.897 [2024-04-24 19:13:09.782905] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.464 19:13:10 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:23.464 19:13:10 -- common/autotest_common.sh@850 -- # return 0 00:06:23.464 19:13:10 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1610746 00:06:23.464 19:13:10 -- event/cpu_locks.sh@85 -- # waitforlisten 1610746 /var/tmp/spdk2.sock 00:06:23.464 19:13:10 -- common/autotest_common.sh@817 -- # '[' -z 1610746 ']' 00:06:23.464 19:13:10 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:23.464 19:13:10 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:23.464 19:13:10 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:23.464 19:13:10 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:23.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:23.464 19:13:10 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:23.464 19:13:10 -- common/autotest_common.sh@10 -- # set +x 00:06:23.464 [2024-04-24 19:13:10.452748] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:23.464 [2024-04-24 19:13:10.452819] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610746 ] 00:06:23.724 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.724 [2024-04-24 19:13:10.553410] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:23.724 [2024-04-24 19:13:10.553435] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.724 [2024-04-24 19:13:10.720347] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.292 19:13:11 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:24.292 19:13:11 -- common/autotest_common.sh@850 -- # return 0 00:06:24.292 19:13:11 -- event/cpu_locks.sh@87 -- # locks_exist 1610706 00:06:24.292 19:13:11 -- event/cpu_locks.sh@22 -- # lslocks -p 1610706 00:06:24.292 19:13:11 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:25.669 lslocks: write error 00:06:25.669 19:13:12 -- event/cpu_locks.sh@89 -- # killprocess 1610706 00:06:25.669 19:13:12 -- common/autotest_common.sh@936 -- # '[' -z 1610706 ']' 00:06:25.669 19:13:12 -- common/autotest_common.sh@940 -- # kill -0 1610706 00:06:25.669 19:13:12 -- common/autotest_common.sh@941 -- # uname 00:06:25.669 19:13:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:25.669 19:13:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1610706 00:06:25.669 19:13:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:25.669 19:13:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:25.669 19:13:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1610706' 00:06:25.669 killing process with pid 1610706 00:06:25.669 19:13:12 -- common/autotest_common.sh@955 -- # kill 1610706 00:06:25.669 19:13:12 -- common/autotest_common.sh@960 -- # wait 1610706 00:06:26.237 19:13:13 -- event/cpu_locks.sh@90 -- # killprocess 1610746 00:06:26.237 19:13:13 -- common/autotest_common.sh@936 -- # '[' -z 1610746 ']' 00:06:26.237 19:13:13 -- common/autotest_common.sh@940 -- # kill -0 1610746 00:06:26.237 19:13:13 -- common/autotest_common.sh@941 -- # uname 00:06:26.237 19:13:13 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:26.237 19:13:13 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1610746 00:06:26.237 19:13:13 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:26.237 19:13:13 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:26.237 19:13:13 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1610746' 00:06:26.237 killing process with pid 1610746 00:06:26.237 19:13:13 -- common/autotest_common.sh@955 -- # kill 1610746 00:06:26.237 19:13:13 -- common/autotest_common.sh@960 -- # wait 1610746 00:06:26.503 00:06:26.503 real 0m3.813s 00:06:26.503 user 0m3.972s 00:06:26.503 sys 0m1.294s 00:06:26.503 19:13:13 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:26.503 19:13:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.503 ************************************ 00:06:26.503 END TEST non_locking_app_on_locked_coremask 00:06:26.503 ************************************ 00:06:26.503 19:13:13 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:26.503 19:13:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:26.503 19:13:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.503 19:13:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 ************************************ 00:06:26.766 START TEST locking_app_on_unlocked_coremask 00:06:26.766 ************************************ 00:06:26.766 19:13:13 -- common/autotest_common.sh@1111 -- # locking_app_on_unlocked_coremask 00:06:26.766 19:13:13 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1611265 00:06:26.766 19:13:13 -- 
event/cpu_locks.sh@99 -- # waitforlisten 1611265 /var/tmp/spdk.sock 00:06:26.766 19:13:13 -- common/autotest_common.sh@817 -- # '[' -z 1611265 ']' 00:06:26.766 19:13:13 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:26.766 19:13:13 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:26.766 19:13:13 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:26.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:26.766 19:13:13 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:26.766 19:13:13 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:26.766 19:13:13 -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 [2024-04-24 19:13:13.611233] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:26.766 [2024-04-24 19:13:13.611308] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611265 ] 00:06:26.766 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.766 [2024-04-24 19:13:13.685197] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:26.766 [2024-04-24 19:13:13.685227] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.766 [2024-04-24 19:13:13.774774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.702 19:13:14 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:27.702 19:13:14 -- common/autotest_common.sh@850 -- # return 0 00:06:27.702 19:13:14 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1611317 00:06:27.702 19:13:14 -- event/cpu_locks.sh@103 -- # waitforlisten 1611317 /var/tmp/spdk2.sock 00:06:27.702 19:13:14 -- common/autotest_common.sh@817 -- # '[' -z 1611317 ']' 00:06:27.702 19:13:14 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:27.702 19:13:14 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:27.702 19:13:14 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:27.702 19:13:14 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:27.702 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:27.702 19:13:14 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:27.702 19:13:14 -- common/autotest_common.sh@10 -- # set +x 00:06:27.702 [2024-04-24 19:13:14.447358] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:27.702 [2024-04-24 19:13:14.447432] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611317 ] 00:06:27.702 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.702 [2024-04-24 19:13:14.544120] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.702 [2024-04-24 19:13:14.705156] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.270 19:13:15 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:28.270 19:13:15 -- common/autotest_common.sh@850 -- # return 0 00:06:28.270 19:13:15 -- event/cpu_locks.sh@105 -- # locks_exist 1611317 00:06:28.270 19:13:15 -- event/cpu_locks.sh@22 -- # lslocks -p 1611317 00:06:28.270 19:13:15 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:28.835 lslocks: write error 00:06:28.835 19:13:15 -- event/cpu_locks.sh@107 -- # killprocess 1611265 00:06:28.835 19:13:15 -- common/autotest_common.sh@936 -- # '[' -z 1611265 ']' 00:06:28.835 19:13:15 -- common/autotest_common.sh@940 -- # kill -0 1611265 00:06:28.836 19:13:15 -- common/autotest_common.sh@941 -- # uname 00:06:28.836 19:13:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:28.836 19:13:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1611265 00:06:29.094 19:13:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:29.094 19:13:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:29.094 19:13:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1611265' 00:06:29.094 killing process with pid 1611265 00:06:29.094 19:13:15 -- common/autotest_common.sh@955 -- # kill 1611265 00:06:29.094 19:13:15 -- common/autotest_common.sh@960 -- # wait 1611265 00:06:29.661 19:13:16 -- event/cpu_locks.sh@108 -- # killprocess 1611317 00:06:29.661 19:13:16 -- common/autotest_common.sh@936 -- # '[' -z 1611317 ']' 00:06:29.661 19:13:16 -- common/autotest_common.sh@940 -- # kill -0 1611317 00:06:29.661 19:13:16 -- common/autotest_common.sh@941 -- # uname 00:06:29.661 19:13:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:29.661 19:13:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1611317 00:06:29.661 19:13:16 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:29.661 19:13:16 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:29.661 19:13:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1611317' 00:06:29.661 killing process with pid 1611317 00:06:29.661 19:13:16 -- common/autotest_common.sh@955 -- # kill 1611317 00:06:29.661 19:13:16 -- common/autotest_common.sh@960 -- # wait 1611317 00:06:29.920 00:06:29.920 real 0m3.322s 00:06:29.920 user 0m3.449s 00:06:29.920 sys 0m1.047s 00:06:29.920 19:13:16 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:29.920 19:13:16 -- common/autotest_common.sh@10 -- # set +x 00:06:29.920 ************************************ 00:06:29.920 END TEST locking_app_on_unlocked_coremask 00:06:29.920 ************************************ 00:06:30.178 19:13:16 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:30.178 19:13:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:30.178 19:13:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.178 19:13:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.178 
************************************ 00:06:30.178 START TEST locking_app_on_locked_coremask 00:06:30.178 ************************************ 00:06:30.178 19:13:17 -- common/autotest_common.sh@1111 -- # locking_app_on_locked_coremask 00:06:30.178 19:13:17 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1611706 00:06:30.178 19:13:17 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:30.178 19:13:17 -- event/cpu_locks.sh@116 -- # waitforlisten 1611706 /var/tmp/spdk.sock 00:06:30.178 19:13:17 -- common/autotest_common.sh@817 -- # '[' -z 1611706 ']' 00:06:30.178 19:13:17 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.178 19:13:17 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:30.178 19:13:17 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.178 19:13:17 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:30.178 19:13:17 -- common/autotest_common.sh@10 -- # set +x 00:06:30.178 [2024-04-24 19:13:17.128485] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:30.178 [2024-04-24 19:13:17.128570] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611706 ] 00:06:30.178 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.437 [2024-04-24 19:13:17.206086] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.437 [2024-04-24 19:13:17.293311] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.003 19:13:17 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:31.003 19:13:17 -- common/autotest_common.sh@850 -- # return 0 00:06:31.003 19:13:17 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1611880 00:06:31.003 19:13:17 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1611880 /var/tmp/spdk2.sock 00:06:31.003 19:13:17 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:31.003 19:13:17 -- common/autotest_common.sh@638 -- # local es=0 00:06:31.003 19:13:17 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 1611880 /var/tmp/spdk2.sock 00:06:31.003 19:13:17 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:31.003 19:13:17 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:31.003 19:13:17 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:31.003 19:13:17 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:31.003 19:13:17 -- common/autotest_common.sh@641 -- # waitforlisten 1611880 /var/tmp/spdk2.sock 00:06:31.003 19:13:17 -- common/autotest_common.sh@817 -- # '[' -z 1611880 ']' 00:06:31.003 19:13:17 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:31.003 19:13:17 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:31.003 19:13:17 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:31.003 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:31.003 19:13:17 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:31.003 19:13:17 -- common/autotest_common.sh@10 -- # set +x 00:06:31.003 [2024-04-24 19:13:17.974418] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:31.003 [2024-04-24 19:13:17.974502] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611880 ] 00:06:31.003 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.261 [2024-04-24 19:13:18.076517] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1611706 has claimed it. 00:06:31.261 [2024-04-24 19:13:18.076556] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:31.827 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (1611880) - No such process 00:06:31.828 ERROR: process (pid: 1611880) is no longer running 00:06:31.828 19:13:18 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:31.828 19:13:18 -- common/autotest_common.sh@850 -- # return 1 00:06:31.828 19:13:18 -- common/autotest_common.sh@641 -- # es=1 00:06:31.828 19:13:18 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:31.828 19:13:18 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:31.828 19:13:18 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:31.828 19:13:18 -- event/cpu_locks.sh@122 -- # locks_exist 1611706 00:06:31.828 19:13:18 -- event/cpu_locks.sh@22 -- # lslocks -p 1611706 00:06:31.828 19:13:18 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:32.086 lslocks: write error 00:06:32.086 19:13:19 -- event/cpu_locks.sh@124 -- # killprocess 1611706 00:06:32.086 19:13:19 -- common/autotest_common.sh@936 -- # '[' -z 1611706 ']' 00:06:32.086 19:13:19 -- common/autotest_common.sh@940 -- # kill -0 1611706 00:06:32.086 19:13:19 -- common/autotest_common.sh@941 -- # uname 00:06:32.086 19:13:19 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:32.086 19:13:19 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1611706 00:06:32.086 19:13:19 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:32.086 19:13:19 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:32.086 19:13:19 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1611706' 00:06:32.086 killing process with pid 1611706 00:06:32.086 19:13:19 -- common/autotest_common.sh@955 -- # kill 1611706 00:06:32.086 19:13:19 -- common/autotest_common.sh@960 -- # wait 1611706 00:06:32.654 00:06:32.654 real 0m2.320s 00:06:32.654 user 0m2.493s 00:06:32.654 sys 0m0.696s 00:06:32.654 19:13:19 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:32.654 19:13:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.654 ************************************ 00:06:32.654 END TEST locking_app_on_locked_coremask 00:06:32.654 ************************************ 00:06:32.654 19:13:19 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:32.654 19:13:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:32.654 19:13:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.654 19:13:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.654 ************************************ 00:06:32.654 START TEST locking_overlapped_coremask 00:06:32.654 
************************************ 00:06:32.654 19:13:19 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask 00:06:32.654 19:13:19 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1612099 00:06:32.654 19:13:19 -- event/cpu_locks.sh@133 -- # waitforlisten 1612099 /var/tmp/spdk.sock 00:06:32.654 19:13:19 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:32.654 19:13:19 -- common/autotest_common.sh@817 -- # '[' -z 1612099 ']' 00:06:32.654 19:13:19 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:32.654 19:13:19 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:32.654 19:13:19 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:32.654 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:32.654 19:13:19 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:32.654 19:13:19 -- common/autotest_common.sh@10 -- # set +x 00:06:32.654 [2024-04-24 19:13:19.653902] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:32.654 [2024-04-24 19:13:19.653967] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612099 ] 00:06:32.912 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.912 [2024-04-24 19:13:19.733677] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:32.912 [2024-04-24 19:13:19.825620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:32.912 [2024-04-24 19:13:19.825643] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:32.912 [2024-04-24 19:13:19.825646] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.480 19:13:20 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:33.480 19:13:20 -- common/autotest_common.sh@850 -- # return 0 00:06:33.480 19:13:20 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1612277 00:06:33.480 19:13:20 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1612277 /var/tmp/spdk2.sock 00:06:33.480 19:13:20 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:33.480 19:13:20 -- common/autotest_common.sh@638 -- # local es=0 00:06:33.480 19:13:20 -- common/autotest_common.sh@640 -- # valid_exec_arg waitforlisten 1612277 /var/tmp/spdk2.sock 00:06:33.480 19:13:20 -- common/autotest_common.sh@626 -- # local arg=waitforlisten 00:06:33.480 19:13:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:33.480 19:13:20 -- common/autotest_common.sh@630 -- # type -t waitforlisten 00:06:33.480 19:13:20 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:33.480 19:13:20 -- common/autotest_common.sh@641 -- # waitforlisten 1612277 /var/tmp/spdk2.sock 00:06:33.480 19:13:20 -- common/autotest_common.sh@817 -- # '[' -z 1612277 ']' 00:06:33.480 19:13:20 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:33.480 19:13:20 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:33.480 19:13:20 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:06:33.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:33.481 19:13:20 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:33.481 19:13:20 -- common/autotest_common.sh@10 -- # set +x 00:06:33.739 [2024-04-24 19:13:20.509849] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:33.739 [2024-04-24 19:13:20.509930] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612277 ] 00:06:33.739 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.739 [2024-04-24 19:13:20.616201] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1612099 has claimed it. 00:06:33.739 [2024-04-24 19:13:20.616243] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:34.306 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 832: kill: (1612277) - No such process 00:06:34.306 ERROR: process (pid: 1612277) is no longer running 00:06:34.306 19:13:21 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:34.306 19:13:21 -- common/autotest_common.sh@850 -- # return 1 00:06:34.306 19:13:21 -- common/autotest_common.sh@641 -- # es=1 00:06:34.306 19:13:21 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:34.306 19:13:21 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:34.306 19:13:21 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:34.306 19:13:21 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:34.306 19:13:21 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:34.306 19:13:21 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:34.306 19:13:21 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:34.306 19:13:21 -- event/cpu_locks.sh@141 -- # killprocess 1612099 00:06:34.306 19:13:21 -- common/autotest_common.sh@936 -- # '[' -z 1612099 ']' 00:06:34.306 19:13:21 -- common/autotest_common.sh@940 -- # kill -0 1612099 00:06:34.306 19:13:21 -- common/autotest_common.sh@941 -- # uname 00:06:34.306 19:13:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:34.306 19:13:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1612099 00:06:34.306 19:13:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:34.306 19:13:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:34.306 19:13:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1612099' 00:06:34.306 killing process with pid 1612099 00:06:34.306 19:13:21 -- common/autotest_common.sh@955 -- # kill 1612099 00:06:34.306 19:13:21 -- common/autotest_common.sh@960 -- # wait 1612099 00:06:34.564 00:06:34.564 real 0m1.925s 00:06:34.564 user 0m5.299s 00:06:34.564 sys 0m0.504s 00:06:34.564 19:13:21 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:34.565 19:13:21 -- common/autotest_common.sh@10 -- # set +x 00:06:34.565 ************************************ 00:06:34.565 END TEST locking_overlapped_coremask 00:06:34.565 ************************************ 00:06:34.823 19:13:21 -- event/cpu_locks.sh@172 -- # 
run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:34.823 19:13:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.823 19:13:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.823 19:13:21 -- common/autotest_common.sh@10 -- # set +x 00:06:34.823 ************************************ 00:06:34.823 START TEST locking_overlapped_coremask_via_rpc 00:06:34.823 ************************************ 00:06:34.823 19:13:21 -- common/autotest_common.sh@1111 -- # locking_overlapped_coremask_via_rpc 00:06:34.823 19:13:21 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:34.823 19:13:21 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1612486 00:06:34.823 19:13:21 -- event/cpu_locks.sh@149 -- # waitforlisten 1612486 /var/tmp/spdk.sock 00:06:34.823 19:13:21 -- common/autotest_common.sh@817 -- # '[' -z 1612486 ']' 00:06:34.823 19:13:21 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.823 19:13:21 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:34.823 19:13:21 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.823 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.823 19:13:21 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:34.823 19:13:21 -- common/autotest_common.sh@10 -- # set +x 00:06:34.823 [2024-04-24 19:13:21.745433] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:34.823 [2024-04-24 19:13:21.745475] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612486 ] 00:06:34.823 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.823 [2024-04-24 19:13:21.813662] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:34.823 [2024-04-24 19:13:21.813694] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:35.081 [2024-04-24 19:13:21.909325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.081 [2024-04-24 19:13:21.909411] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:35.081 [2024-04-24 19:13:21.909414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.648 19:13:22 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:35.648 19:13:22 -- common/autotest_common.sh@850 -- # return 0 00:06:35.648 19:13:22 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1612509 00:06:35.648 19:13:22 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:35.648 19:13:22 -- event/cpu_locks.sh@153 -- # waitforlisten 1612509 /var/tmp/spdk2.sock 00:06:35.648 19:13:22 -- common/autotest_common.sh@817 -- # '[' -z 1612509 ']' 00:06:35.648 19:13:22 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:35.648 19:13:22 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:35.648 19:13:22 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:35.648 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:35.648 19:13:22 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:35.648 19:13:22 -- common/autotest_common.sh@10 -- # set +x 00:06:35.648 [2024-04-24 19:13:22.616315] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:35.648 [2024-04-24 19:13:22.616404] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612509 ] 00:06:35.648 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.906 [2024-04-24 19:13:22.715677] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:35.906 [2024-04-24 19:13:22.715706] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:35.906 [2024-04-24 19:13:22.883005] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:35.907 [2024-04-24 19:13:22.886116] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:35.907 [2024-04-24 19:13:22.886118] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:36.474 19:13:23 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:36.474 19:13:23 -- common/autotest_common.sh@850 -- # return 0 00:06:36.474 19:13:23 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:36.474 19:13:23 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:36.474 19:13:23 -- common/autotest_common.sh@10 -- # set +x 00:06:36.474 19:13:23 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:36.474 19:13:23 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.474 19:13:23 -- common/autotest_common.sh@638 -- # local es=0 00:06:36.474 19:13:23 -- common/autotest_common.sh@640 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.474 19:13:23 -- common/autotest_common.sh@626 -- # local arg=rpc_cmd 00:06:36.474 19:13:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:36.474 19:13:23 -- common/autotest_common.sh@630 -- # type -t rpc_cmd 00:06:36.474 19:13:23 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:36.474 19:13:23 -- common/autotest_common.sh@641 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:36.474 19:13:23 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:36.474 19:13:23 -- common/autotest_common.sh@10 -- # set +x 00:06:36.474 [2024-04-24 19:13:23.462122] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1612486 has claimed it. 
00:06:36.474 request: 00:06:36.474 { 00:06:36.474 "method": "framework_enable_cpumask_locks", 00:06:36.474 "req_id": 1 00:06:36.474 } 00:06:36.474 Got JSON-RPC error response 00:06:36.474 response: 00:06:36.474 { 00:06:36.474 "code": -32603, 00:06:36.474 "message": "Failed to claim CPU core: 2" 00:06:36.474 } 00:06:36.474 19:13:23 -- common/autotest_common.sh@577 -- # [[ 1 == 0 ]] 00:06:36.474 19:13:23 -- common/autotest_common.sh@641 -- # es=1 00:06:36.474 19:13:23 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:36.474 19:13:23 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:36.474 19:13:23 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:36.474 19:13:23 -- event/cpu_locks.sh@158 -- # waitforlisten 1612486 /var/tmp/spdk.sock 00:06:36.474 19:13:23 -- common/autotest_common.sh@817 -- # '[' -z 1612486 ']' 00:06:36.474 19:13:23 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:36.474 19:13:23 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:36.474 19:13:23 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:36.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:36.474 19:13:23 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:36.474 19:13:23 -- common/autotest_common.sh@10 -- # set +x 00:06:36.732 19:13:23 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:36.732 19:13:23 -- common/autotest_common.sh@850 -- # return 0 00:06:36.732 19:13:23 -- event/cpu_locks.sh@159 -- # waitforlisten 1612509 /var/tmp/spdk2.sock 00:06:36.732 19:13:23 -- common/autotest_common.sh@817 -- # '[' -z 1612509 ']' 00:06:36.732 19:13:23 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:36.732 19:13:23 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:36.732 19:13:23 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:36.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:36.732 19:13:23 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:36.732 19:13:23 -- common/autotest_common.sh@10 -- # set +x 00:06:36.991 19:13:23 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:36.991 19:13:23 -- common/autotest_common.sh@850 -- # return 0 00:06:36.991 19:13:23 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:36.991 19:13:23 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:36.991 19:13:23 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:36.991 19:13:23 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:36.991 00:06:36.991 real 0m2.118s 00:06:36.991 user 0m0.831s 00:06:36.991 sys 0m0.214s 00:06:36.991 19:13:23 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:36.991 19:13:23 -- common/autotest_common.sh@10 -- # set +x 00:06:36.991 ************************************ 00:06:36.991 END TEST locking_overlapped_coremask_via_rpc 00:06:36.991 ************************************ 00:06:36.991 19:13:23 -- event/cpu_locks.sh@174 -- # cleanup 00:06:36.991 19:13:23 -- event/cpu_locks.sh@15 -- # [[ -z 1612486 ]] 00:06:36.991 19:13:23 -- event/cpu_locks.sh@15 -- # killprocess 1612486 00:06:36.991 19:13:23 -- common/autotest_common.sh@936 -- # '[' -z 1612486 ']' 00:06:36.991 19:13:23 -- common/autotest_common.sh@940 -- # kill -0 1612486 00:06:36.991 19:13:23 -- common/autotest_common.sh@941 -- # uname 00:06:36.991 19:13:23 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:36.991 19:13:23 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1612486 00:06:36.991 19:13:23 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:36.991 19:13:23 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:36.991 19:13:23 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1612486' 00:06:36.991 killing process with pid 1612486 00:06:36.991 19:13:23 -- common/autotest_common.sh@955 -- # kill 1612486 00:06:36.991 19:13:23 -- common/autotest_common.sh@960 -- # wait 1612486 00:06:37.558 19:13:24 -- event/cpu_locks.sh@16 -- # [[ -z 1612509 ]] 00:06:37.558 19:13:24 -- event/cpu_locks.sh@16 -- # killprocess 1612509 00:06:37.558 19:13:24 -- common/autotest_common.sh@936 -- # '[' -z 1612509 ']' 00:06:37.558 19:13:24 -- common/autotest_common.sh@940 -- # kill -0 1612509 00:06:37.558 19:13:24 -- common/autotest_common.sh@941 -- # uname 00:06:37.558 19:13:24 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:37.558 19:13:24 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1612509 00:06:37.558 19:13:24 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:37.558 19:13:24 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:37.558 19:13:24 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1612509' 00:06:37.558 killing process with pid 1612509 00:06:37.558 19:13:24 -- common/autotest_common.sh@955 -- # kill 1612509 00:06:37.558 19:13:24 -- common/autotest_common.sh@960 -- # wait 1612509 00:06:37.816 19:13:24 -- event/cpu_locks.sh@18 -- # rm -f 00:06:37.816 19:13:24 -- event/cpu_locks.sh@1 -- # cleanup 00:06:37.816 19:13:24 -- event/cpu_locks.sh@15 -- # [[ -z 1612486 ]] 00:06:37.816 19:13:24 -- event/cpu_locks.sh@15 -- # killprocess 1612486 
00:06:37.816 19:13:24 -- common/autotest_common.sh@936 -- # '[' -z 1612486 ']' 00:06:37.816 19:13:24 -- common/autotest_common.sh@940 -- # kill -0 1612486 00:06:37.816 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1612486) - No such process 00:06:37.816 19:13:24 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1612486 is not found' 00:06:37.816 Process with pid 1612486 is not found 00:06:37.816 19:13:24 -- event/cpu_locks.sh@16 -- # [[ -z 1612509 ]] 00:06:37.816 19:13:24 -- event/cpu_locks.sh@16 -- # killprocess 1612509 00:06:37.816 19:13:24 -- common/autotest_common.sh@936 -- # '[' -z 1612509 ']' 00:06:37.816 19:13:24 -- common/autotest_common.sh@940 -- # kill -0 1612509 00:06:37.816 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1612509) - No such process 00:06:37.816 19:13:24 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1612509 is not found' 00:06:37.816 Process with pid 1612509 is not found 00:06:37.816 19:13:24 -- event/cpu_locks.sh@18 -- # rm -f 00:06:37.816 00:06:37.816 real 0m19.139s 00:06:37.816 user 0m30.611s 00:06:37.816 sys 0m6.431s 00:06:37.816 19:13:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:37.816 19:13:24 -- common/autotest_common.sh@10 -- # set +x 00:06:37.816 ************************************ 00:06:37.816 END TEST cpu_locks 00:06:37.816 ************************************ 00:06:37.816 00:06:37.816 real 0m45.895s 00:06:37.816 user 1m20.796s 00:06:37.816 sys 0m11.263s 00:06:37.816 19:13:24 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:37.816 19:13:24 -- common/autotest_common.sh@10 -- # set +x 00:06:37.816 ************************************ 00:06:37.817 END TEST event 00:06:37.817 ************************************ 00:06:37.817 19:13:24 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:37.817 19:13:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:37.817 19:13:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.817 19:13:24 -- common/autotest_common.sh@10 -- # set +x 00:06:38.075 ************************************ 00:06:38.075 START TEST thread 00:06:38.075 ************************************ 00:06:38.075 19:13:24 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:38.075 * Looking for test storage... 00:06:38.075 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:38.075 19:13:25 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:38.075 19:13:25 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:38.075 19:13:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.075 19:13:25 -- common/autotest_common.sh@10 -- # set +x 00:06:38.334 ************************************ 00:06:38.334 START TEST thread_poller_perf 00:06:38.334 ************************************ 00:06:38.334 19:13:25 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:38.334 [2024-04-24 19:13:25.173909] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:38.334 [2024-04-24 19:13:25.173996] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612966 ] 00:06:38.334 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.334 [2024-04-24 19:13:25.252249] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.334 [2024-04-24 19:13:25.337132] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.334 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:06:39.708 ====================================== 00:06:39.708 busy:2304578476 (cyc) 00:06:39.708 total_run_count: 845000 00:06:39.708 tsc_hz: 2300000000 (cyc) 00:06:39.708 ====================================== 00:06:39.708 poller_cost: 2727 (cyc), 1185 (nsec) 00:06:39.708 00:06:39.708 real 0m1.260s 00:06:39.708 user 0m1.156s 00:06:39.708 sys 0m0.099s 00:06:39.708 19:13:26 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:39.708 19:13:26 -- common/autotest_common.sh@10 -- # set +x 00:06:39.708 ************************************ 00:06:39.708 END TEST thread_poller_perf 00:06:39.708 ************************************ 00:06:39.708 19:13:26 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:39.708 19:13:26 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:39.708 19:13:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:39.708 19:13:26 -- common/autotest_common.sh@10 -- # set +x 00:06:39.708 ************************************ 00:06:39.708 START TEST thread_poller_perf 00:06:39.708 ************************************ 00:06:39.708 19:13:26 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:39.708 [2024-04-24 19:13:26.635012] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:39.708 [2024-04-24 19:13:26.635107] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613170 ] 00:06:39.708 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.708 [2024-04-24 19:13:26.713244] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.966 [2024-04-24 19:13:26.810831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.966 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:40.903 ====================================== 00:06:40.903 busy:2301444426 (cyc) 00:06:40.903 total_run_count: 13427000 00:06:40.903 tsc_hz: 2300000000 (cyc) 00:06:40.903 ====================================== 00:06:40.903 poller_cost: 171 (cyc), 74 (nsec) 00:06:40.903 00:06:40.903 real 0m1.271s 00:06:40.903 user 0m1.165s 00:06:40.903 sys 0m0.101s 00:06:40.903 19:13:27 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:40.903 19:13:27 -- common/autotest_common.sh@10 -- # set +x 00:06:40.903 ************************************ 00:06:40.903 END TEST thread_poller_perf 00:06:40.903 ************************************ 00:06:41.162 19:13:27 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:41.162 19:13:27 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:41.162 19:13:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:41.162 19:13:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.162 19:13:27 -- common/autotest_common.sh@10 -- # set +x 00:06:41.162 ************************************ 00:06:41.162 START TEST thread_spdk_lock 00:06:41.162 ************************************ 00:06:41.162 19:13:28 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:41.162 [2024-04-24 19:13:28.106226] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:41.162 [2024-04-24 19:13:28.106312] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613380 ] 00:06:41.162 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.420 [2024-04-24 19:13:28.188886] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:41.420 [2024-04-24 19:13:28.278672] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:41.420 [2024-04-24 19:13:28.278674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.986 [2024-04-24 19:13:28.771066] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.987 [2024-04-24 19:13:28.771104] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:41.987 [2024-04-24 19:13:28.771115] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x14b6140 00:06:41.987 [2024-04-24 19:13:28.771957] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.987 [2024-04-24 19:13:28.772067] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.987 [2024-04-24 19:13:28.772085] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:41.987 
Starting test contend 00:06:41.987 Worker Delay Wait us Hold us Total us 00:06:41.987 0 3 173108 186582 359690 00:06:41.987 1 5 92199 285615 377814 00:06:41.987 PASS test contend 00:06:41.987 Starting test hold_by_poller 00:06:41.987 PASS test hold_by_poller 00:06:41.987 Starting test hold_by_message 00:06:41.987 PASS test hold_by_message 00:06:41.987 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:41.987 100014 assertions passed 00:06:41.987 0 assertions failed 00:06:41.987 00:06:41.987 real 0m0.760s 00:06:41.987 user 0m1.148s 00:06:41.987 sys 0m0.100s 00:06:41.987 19:13:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:41.987 19:13:28 -- common/autotest_common.sh@10 -- # set +x 00:06:41.987 ************************************ 00:06:41.987 END TEST thread_spdk_lock 00:06:41.987 ************************************ 00:06:41.987 00:06:41.987 real 0m3.990s 00:06:41.987 user 0m3.727s 00:06:41.987 sys 0m0.695s 00:06:41.987 19:13:28 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:41.987 19:13:28 -- common/autotest_common.sh@10 -- # set +x 00:06:41.987 ************************************ 00:06:41.987 END TEST thread 00:06:41.987 ************************************ 00:06:41.987 19:13:28 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:41.987 19:13:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:41.987 19:13:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.987 19:13:28 -- common/autotest_common.sh@10 -- # set +x 00:06:42.245 ************************************ 00:06:42.245 START TEST accel 00:06:42.245 ************************************ 00:06:42.245 19:13:29 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:42.245 * Looking for test storage... 00:06:42.245 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:42.245 19:13:29 -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:42.245 19:13:29 -- accel/accel.sh@82 -- # get_expected_opcs 00:06:42.245 19:13:29 -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:42.245 19:13:29 -- accel/accel.sh@62 -- # spdk_tgt_pid=1613614 00:06:42.245 19:13:29 -- accel/accel.sh@63 -- # waitforlisten 1613614 00:06:42.245 19:13:29 -- common/autotest_common.sh@817 -- # '[' -z 1613614 ']' 00:06:42.245 19:13:29 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:42.245 19:13:29 -- common/autotest_common.sh@822 -- # local max_retries=100 00:06:42.245 19:13:29 -- accel/accel.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:42.245 19:13:29 -- accel/accel.sh@61 -- # build_accel_config 00:06:42.245 19:13:29 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:42.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:42.245 19:13:29 -- common/autotest_common.sh@826 -- # xtrace_disable 00:06:42.245 19:13:29 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:42.245 19:13:29 -- common/autotest_common.sh@10 -- # set +x 00:06:42.245 19:13:29 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:42.245 19:13:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.245 19:13:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.245 19:13:29 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:42.245 19:13:29 -- accel/accel.sh@40 -- # local IFS=, 00:06:42.245 19:13:29 -- accel/accel.sh@41 -- # jq -r . 00:06:42.246 [2024-04-24 19:13:29.213902] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:42.246 [2024-04-24 19:13:29.213996] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613614 ] 00:06:42.246 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.504 [2024-04-24 19:13:29.291287] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.504 [2024-04-24 19:13:29.371929] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.071 19:13:30 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:06:43.071 19:13:30 -- common/autotest_common.sh@850 -- # return 0 00:06:43.071 19:13:30 -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:43.071 19:13:30 -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:43.071 19:13:30 -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:43.071 19:13:30 -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:43.071 19:13:30 -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:43.071 19:13:30 -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:43.071 19:13:30 -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:43.071 19:13:30 -- common/autotest_common.sh@549 -- # xtrace_disable 00:06:43.071 19:13:30 -- common/autotest_common.sh@10 -- # set +x 00:06:43.071 19:13:30 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.071 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.071 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.071 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 
19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # IFS== 00:06:43.331 19:13:30 -- accel/accel.sh@72 -- # read -r opc module 00:06:43.331 19:13:30 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:43.331 19:13:30 -- accel/accel.sh@75 -- # killprocess 1613614 00:06:43.331 19:13:30 -- common/autotest_common.sh@936 -- # '[' -z 1613614 ']' 00:06:43.331 19:13:30 -- common/autotest_common.sh@940 -- # kill -0 1613614 00:06:43.331 19:13:30 -- common/autotest_common.sh@941 -- # uname 00:06:43.331 19:13:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:43.331 19:13:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1613614 00:06:43.331 19:13:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:43.331 19:13:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:43.331 19:13:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1613614' 00:06:43.331 killing process with pid 1613614 00:06:43.331 19:13:30 -- common/autotest_common.sh@955 -- # kill 1613614 00:06:43.331 19:13:30 -- common/autotest_common.sh@960 -- # wait 1613614 00:06:43.590 19:13:30 -- accel/accel.sh@76 -- # trap - ERR 00:06:43.590 19:13:30 -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:43.590 19:13:30 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:43.590 19:13:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.590 19:13:30 -- common/autotest_common.sh@10 -- # set +x 00:06:43.849 19:13:30 -- common/autotest_common.sh@1111 -- # accel_perf -h 00:06:43.849 19:13:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:43.849 19:13:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.849 19:13:30 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.849 19:13:30 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.849 19:13:30 -- accel/accel.sh@40 -- # local IFS=, 00:06:43.849 19:13:30 -- accel/accel.sh@41 -- # jq -r . 
00:06:43.849 19:13:30 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:43.849 19:13:30 -- common/autotest_common.sh@10 -- # set +x 00:06:43.849 19:13:30 -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:43.849 19:13:30 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:43.849 19:13:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.849 19:13:30 -- common/autotest_common.sh@10 -- # set +x 00:06:43.849 ************************************ 00:06:43.849 START TEST accel_missing_filename 00:06:43.849 ************************************ 00:06:43.849 19:13:30 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress 00:06:43.849 19:13:30 -- common/autotest_common.sh@638 -- # local es=0 00:06:43.849 19:13:30 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:43.849 19:13:30 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:43.849 19:13:30 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:43.849 19:13:30 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:43.849 19:13:30 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:43.849 19:13:30 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress 00:06:43.849 19:13:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:43.849 19:13:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.849 19:13:30 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.849 19:13:30 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.849 19:13:30 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.849 19:13:30 -- accel/accel.sh@40 -- # local IFS=, 00:06:43.849 19:13:30 -- accel/accel.sh@41 -- # jq -r . 00:06:44.108 [2024-04-24 19:13:30.877647] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:44.108 [2024-04-24 19:13:30.877726] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613900 ] 00:06:44.108 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.108 [2024-04-24 19:13:30.956389] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.108 [2024-04-24 19:13:31.038276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.108 [2024-04-24 19:13:31.084897] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:44.367 [2024-04-24 19:13:31.154120] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:44.367 A filename is required. 
00:06:44.367 19:13:31 -- common/autotest_common.sh@641 -- # es=234 00:06:44.367 19:13:31 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:44.367 19:13:31 -- common/autotest_common.sh@650 -- # es=106 00:06:44.367 19:13:31 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:44.367 19:13:31 -- common/autotest_common.sh@658 -- # es=1 00:06:44.367 19:13:31 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:44.367 00:06:44.367 real 0m0.378s 00:06:44.367 user 0m0.263s 00:06:44.367 sys 0m0.154s 00:06:44.367 19:13:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:44.367 19:13:31 -- common/autotest_common.sh@10 -- # set +x 00:06:44.367 ************************************ 00:06:44.367 END TEST accel_missing_filename 00:06:44.367 ************************************ 00:06:44.367 19:13:31 -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:44.367 19:13:31 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:44.367 19:13:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.367 19:13:31 -- common/autotest_common.sh@10 -- # set +x 00:06:44.625 ************************************ 00:06:44.625 START TEST accel_compress_verify 00:06:44.625 ************************************ 00:06:44.625 19:13:31 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:44.625 19:13:31 -- common/autotest_common.sh@638 -- # local es=0 00:06:44.625 19:13:31 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:44.625 19:13:31 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:44.625 19:13:31 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:44.625 19:13:31 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:44.625 19:13:31 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:44.625 19:13:31 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:44.625 19:13:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.625 19:13:31 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:44.625 19:13:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:44.625 19:13:31 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:44.625 19:13:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.625 19:13:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.625 19:13:31 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:44.625 19:13:31 -- accel/accel.sh@40 -- # local IFS=, 00:06:44.625 19:13:31 -- accel/accel.sh@41 -- # jq -r . 00:06:44.625 [2024-04-24 19:13:31.457579] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:44.625 [2024-04-24 19:13:31.457666] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614037 ] 00:06:44.625 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.625 [2024-04-24 19:13:31.533032] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.625 [2024-04-24 19:13:31.615486] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.883 [2024-04-24 19:13:31.662451] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:44.883 [2024-04-24 19:13:31.731458] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:06:44.883 00:06:44.883 Compression does not support the verify option, aborting. 00:06:44.883 19:13:31 -- common/autotest_common.sh@641 -- # es=161 00:06:44.883 19:13:31 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:44.883 19:13:31 -- common/autotest_common.sh@650 -- # es=33 00:06:44.883 19:13:31 -- common/autotest_common.sh@651 -- # case "$es" in 00:06:44.883 19:13:31 -- common/autotest_common.sh@658 -- # es=1 00:06:44.883 19:13:31 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:44.883 00:06:44.883 real 0m0.374s 00:06:44.883 user 0m0.244s 00:06:44.883 sys 0m0.150s 00:06:44.883 19:13:31 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:44.883 19:13:31 -- common/autotest_common.sh@10 -- # set +x 00:06:44.883 ************************************ 00:06:44.883 END TEST accel_compress_verify 00:06:44.883 ************************************ 00:06:44.883 19:13:31 -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:44.883 19:13:31 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:44.883 19:13:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.883 19:13:31 -- common/autotest_common.sh@10 -- # set +x 00:06:45.143 ************************************ 00:06:45.143 START TEST accel_wrong_workload 00:06:45.143 ************************************ 00:06:45.143 19:13:32 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w foobar 00:06:45.143 19:13:32 -- common/autotest_common.sh@638 -- # local es=0 00:06:45.143 19:13:32 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:45.143 19:13:32 -- common/autotest_common.sh@626 -- # local arg=accel_perf 00:06:45.143 19:13:32 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:45.143 19:13:32 -- common/autotest_common.sh@630 -- # type -t accel_perf 00:06:45.143 19:13:32 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:06:45.143 19:13:32 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w foobar 00:06:45.143 19:13:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:45.143 19:13:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.143 19:13:32 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.143 19:13:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.143 19:13:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.143 19:13:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.143 19:13:32 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.143 19:13:32 -- accel/accel.sh@40 -- # local IFS=, 00:06:45.143 19:13:32 -- accel/accel.sh@41 -- # jq -r . 
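The build_accel_config trace just above (accel_json_cfg=(), the "[[ 0 -gt 0 ]]" guards, local IFS=, and jq -r .) is the harness assembling an optional accel JSON config and handing it to accel_perf as the /dev/fd/62 path seen on the command line. A plausible plain-bash reconstruction, purely illustrative -- the names and JSON shape are assumptions, not the verbatim accel.sh helper:

    # Assumption: build_accel_config joins any queued JSON fragments with
    # commas and pretty-prints the result; the <(...) process substitution
    # is what shows up as /dev/fd/62 in the trace above.
    accel_json_cfg=()   # module-config fragments would be appended here
    build_accel_config() {
        local IFS=,
        printf '{"subsystems":[{"subsystem":"accel","config":[%s]}]}' \
            "${accel_json_cfg[*]}" | jq -r .
    }
    # hypothetical call; accel_perf stands in for the binary path in the log
    accel_perf -c <(build_accel_config) -t 1 -w foobar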
00:06:45.143 Unsupported workload type: foobar
00:06:45.143 [2024-04-24 19:13:32.027896] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1
00:06:45.143 accel_perf options:
00:06:45.143 [-h help message]
00:06:45.143 [-q queue depth per core]
00:06:45.143 [-C for supported workloads, use this value to configure the io vector size to test (default 1)]
00:06:45.143 [-T number of threads per core]
00:06:45.143 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:45.143 [-t time in seconds]
00:06:45.143 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, dif_verify, dif_generate, dif_generate_copy]
00:06:45.143 [-M assign module to the operation, not compatible with accel_assign_opc RPC]
00:06:45.143 [-l for compress/decompress workloads, name of uncompressed input file]
00:06:45.143 [-S for crc32c workload, use this seed value (default 0)]
00:06:45.143 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)]
00:06:45.143 [-f for fill workload, use this BYTE value (default 255)]
00:06:45.143 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:45.143 [-y verify result if this switch is on]
00:06:45.143 [-a tasks to allocate per core (default: same value as -q)]
00:06:45.143 Can be used to spread operations across a wider range of memory.
00:06:45.143 19:13:32 -- common/autotest_common.sh@641 -- # es=1
00:06:45.143 19:13:32 -- common/autotest_common.sh@649 -- # (( es > 128 ))
00:06:45.143 19:13:32 -- common/autotest_common.sh@660 -- # [[ -n '' ]]
00:06:45.143 19:13:32 -- common/autotest_common.sh@665 -- # (( !es == 0 ))
00:06:45.143 
00:06:45.143 real 0m0.029s
00:06:45.143 user 0m0.012s
00:06:45.143 sys 0m0.017s
00:06:45.143 19:13:32 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:06:45.143 19:13:32 -- common/autotest_common.sh@10 -- # set +x
00:06:45.143 ************************************
00:06:45.143 END TEST accel_wrong_workload
00:06:45.143 ************************************
00:06:45.143 Error: writing output failed: Broken pipe
00:06:45.143 19:13:32 -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1
00:06:45.143 19:13:32 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']'
00:06:45.143 19:13:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:06:45.143 19:13:32 -- common/autotest_common.sh@10 -- # set +x
00:06:45.143 ************************************
00:06:45.402 START TEST accel_negative_buffers
00:06:45.402 ************************************
00:06:45.402 19:13:32 -- common/autotest_common.sh@1111 -- # NOT accel_perf -t 1 -w xor -y -x -1
00:06:45.402 19:13:32 -- common/autotest_common.sh@638 -- # local es=0
00:06:45.402 19:13:32 -- common/autotest_common.sh@640 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1
00:06:45.402 19:13:32 -- common/autotest_common.sh@626 -- # local arg=accel_perf
00:06:45.403 19:13:32 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in
00:06:45.403 19:13:32 -- common/autotest_common.sh@630 -- # type -t accel_perf
00:06:45.403 19:13:32 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in
00:06:45.403 19:13:32 -- common/autotest_common.sh@641 -- # accel_perf -t 1 -w xor -y -x -1
00:06:45.403 19:13:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:06:45.403 19:13:32 -- accel/accel.sh@12 -- # build_accel_config
00:06:45.403 19:13:32 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:06:45.403 19:13:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:06:45.403 19:13:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:45.403 19:13:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:45.403 19:13:32 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:06:45.403 19:13:32 -- accel/accel.sh@40 -- # local IFS=,
00:06:45.403 19:13:32 -- accel/accel.sh@41 -- # jq -r .
00:06:45.403 -x option must be non-negative.
00:06:45.403 [2024-04-24 19:13:32.222447] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:06:45.403 accel_perf options:
00:06:45.403 [-h help message]
00:06:45.403 [-q queue depth per core]
00:06:45.403 [-C for supported workloads, use this value to configure the io vector size to test (default 1)]
00:06:45.403 [-T number of threads per core]
00:06:45.403 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:06:45.403 [-t time in seconds]
00:06:45.403 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, dif_verify, dif_generate, dif_generate_copy]
00:06:45.403 [-M assign module to the operation, not compatible with accel_assign_opc RPC]
00:06:45.403 [-l for compress/decompress workloads, name of uncompressed input file]
00:06:45.403 [-S for crc32c workload, use this seed value (default 0)]
00:06:45.403 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)]
00:06:45.403 [-f for fill workload, use this BYTE value (default 255)]
00:06:45.403 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:06:45.403 [-y verify result if this switch is on]
00:06:45.403 [-a tasks to allocate per core (default: same value as -q)]
00:06:45.403 Can be used to spread operations across a wider range of memory.
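For contrast with the two failing invocations above (-w foobar and -x -1), a valid run would use only options from this usage listing. Illustrative sketch, not taken from the log: the binary path is the one shown in the trace, and the -c config argument is omitted on the assumption that it is optional.

    # xor across two source buffers (the documented minimum), 1 second, verify results
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w xor -x 2 -y

    # crc32c with seed 32 and verification, matching the accel_crc32c test that follows
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w crc32c -S 32 -y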
00:06:45.403 19:13:32 -- common/autotest_common.sh@641 -- # es=1 00:06:45.403 19:13:32 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:06:45.403 19:13:32 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:06:45.403 19:13:32 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:06:45.403 00:06:45.403 real 0m0.027s 00:06:45.403 user 0m0.013s 00:06:45.403 sys 0m0.015s 00:06:45.403 19:13:32 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:45.403 19:13:32 -- common/autotest_common.sh@10 -- # set +x 00:06:45.403 ************************************ 00:06:45.403 END TEST accel_negative_buffers 00:06:45.403 ************************************ 00:06:45.403 Error: writing output failed: Broken pipe 00:06:45.403 19:13:32 -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:45.403 19:13:32 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:45.403 19:13:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:45.403 19:13:32 -- common/autotest_common.sh@10 -- # set +x 00:06:45.662 ************************************ 00:06:45.662 START TEST accel_crc32c 00:06:45.662 ************************************ 00:06:45.662 19:13:32 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:45.662 19:13:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:45.662 19:13:32 -- accel/accel.sh@17 -- # local accel_module 00:06:45.662 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.662 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.662 19:13:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:45.662 19:13:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:45.662 19:13:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.662 19:13:32 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:45.662 19:13:32 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:45.662 19:13:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.662 19:13:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.662 19:13:32 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:45.662 19:13:32 -- accel/accel.sh@40 -- # local IFS=, 00:06:45.662 19:13:32 -- accel/accel.sh@41 -- # jq -r . 00:06:45.662 [2024-04-24 19:13:32.467648] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:45.662 [2024-04-24 19:13:32.467734] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614287 ] 00:06:45.662 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.662 [2024-04-24 19:13:32.544721] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.662 [2024-04-24 19:13:32.634110] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=0x1 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=crc32c 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=32 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=software 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@22 -- # accel_module=software 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=32 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=32 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- 
accel/accel.sh@20 -- # val=1 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val=Yes 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:45.922 19:13:32 -- accel/accel.sh@20 -- # val= 00:06:45.922 19:13:32 -- accel/accel.sh@21 -- # case "$var" in 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # IFS=: 00:06:45.922 19:13:32 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@20 -- # val= 00:06:46.857 19:13:33 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # IFS=: 00:06:46.857 19:13:33 -- accel/accel.sh@19 -- # read -r var val 00:06:46.857 19:13:33 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:46.857 19:13:33 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:46.857 19:13:33 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.857 00:06:46.857 real 0m1.391s 00:06:46.857 user 0m1.251s 00:06:46.857 sys 0m0.152s 00:06:46.857 19:13:33 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:46.857 19:13:33 -- common/autotest_common.sh@10 -- # set +x 00:06:46.857 ************************************ 00:06:46.857 END TEST accel_crc32c 00:06:46.857 ************************************ 00:06:47.115 19:13:33 -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:47.115 19:13:33 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:47.115 19:13:33 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:47.115 19:13:33 -- common/autotest_common.sh@10 -- # set +x 00:06:47.115 ************************************ 00:06:47.115 START TEST 
accel_crc32c_C2 00:06:47.115 ************************************ 00:06:47.115 19:13:34 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:47.115 19:13:34 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.115 19:13:34 -- accel/accel.sh@17 -- # local accel_module 00:06:47.115 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.115 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.115 19:13:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:47.115 19:13:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:47.115 19:13:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.115 19:13:34 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:47.115 19:13:34 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:47.115 19:13:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.115 19:13:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.115 19:13:34 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:47.115 19:13:34 -- accel/accel.sh@40 -- # local IFS=, 00:06:47.115 19:13:34 -- accel/accel.sh@41 -- # jq -r . 00:06:47.115 [2024-04-24 19:13:34.030230] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:47.116 [2024-04-24 19:13:34.030315] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614493 ] 00:06:47.116 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.116 [2024-04-24 19:13:34.105817] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.374 [2024-04-24 19:13:34.189383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=0x1 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=crc32c 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=0 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 
-- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=software 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@22 -- # accel_module=software 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=32 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=32 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val=1 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.374 19:13:34 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:47.374 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.374 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.375 19:13:34 -- accel/accel.sh@20 -- # val=Yes 00:06:47.375 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.375 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.375 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:47.375 19:13:34 -- accel/accel.sh@20 -- # val= 00:06:47.375 19:13:34 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # IFS=: 00:06:47.375 19:13:34 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 
-- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:48.751 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:48.751 19:13:35 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:48.751 19:13:35 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:48.751 00:06:48.751 real 0m1.381s 00:06:48.751 user 0m1.244s 00:06:48.751 sys 0m0.148s 00:06:48.751 19:13:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:48.751 19:13:35 -- common/autotest_common.sh@10 -- # set +x 00:06:48.751 ************************************ 00:06:48.751 END TEST accel_crc32c_C2 00:06:48.751 ************************************ 00:06:48.751 19:13:35 -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:48.751 19:13:35 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:48.751 19:13:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:48.751 19:13:35 -- common/autotest_common.sh@10 -- # set +x 00:06:48.751 ************************************ 00:06:48.751 START TEST accel_copy 00:06:48.751 ************************************ 00:06:48.751 19:13:35 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy -y 00:06:48.751 19:13:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:48.751 19:13:35 -- accel/accel.sh@17 -- # local accel_module 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:48.751 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:48.751 19:13:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:48.751 19:13:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:48.751 19:13:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.751 19:13:35 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:48.751 19:13:35 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:48.751 19:13:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.751 19:13:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.751 19:13:35 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:48.751 19:13:35 -- accel/accel.sh@40 -- # local IFS=, 00:06:48.751 19:13:35 -- accel/accel.sh@41 -- # jq -r . 00:06:48.751 [2024-04-24 19:13:35.588765] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:48.751 [2024-04-24 19:13:35.588844] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614693 ] 00:06:48.751 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.751 [2024-04-24 19:13:35.667530] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.751 [2024-04-24 19:13:35.749464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=0x1 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=copy 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@23 -- # accel_opc=copy 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=software 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@22 -- # accel_module=software 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=32 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=32 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=1 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val=Yes 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:49.010 19:13:35 -- accel/accel.sh@20 -- # val= 00:06:49.010 19:13:35 -- accel/accel.sh@21 -- # case "$var" in 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # IFS=: 00:06:49.010 19:13:35 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@20 -- # val= 00:06:50.019 19:13:36 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # IFS=: 00:06:50.019 19:13:36 -- accel/accel.sh@19 -- # read -r var val 00:06:50.019 19:13:36 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:50.019 19:13:36 -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:50.019 19:13:36 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.019 00:06:50.019 real 0m1.374s 00:06:50.019 user 0m1.245s 00:06:50.019 sys 0m0.140s 00:06:50.019 19:13:36 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:50.019 19:13:36 -- common/autotest_common.sh@10 -- # set +x 00:06:50.019 ************************************ 00:06:50.019 END TEST accel_copy 00:06:50.019 ************************************ 00:06:50.019 19:13:36 -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.019 19:13:36 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:50.019 19:13:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:50.019 19:13:36 -- common/autotest_common.sh@10 -- # set +x 00:06:50.277 ************************************ 00:06:50.277 START TEST accel_fill 00:06:50.277 ************************************ 00:06:50.277 19:13:37 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.277 19:13:37 -- accel/accel.sh@16 -- # local accel_opc 
00:06:50.277 19:13:37 -- accel/accel.sh@17 -- # local accel_module 00:06:50.277 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.277 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.277 19:13:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.277 19:13:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:50.277 19:13:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.277 19:13:37 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.277 19:13:37 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:50.277 19:13:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.278 19:13:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.278 19:13:37 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.278 19:13:37 -- accel/accel.sh@40 -- # local IFS=, 00:06:50.278 19:13:37 -- accel/accel.sh@41 -- # jq -r . 00:06:50.278 [2024-04-24 19:13:37.157412] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:50.278 [2024-04-24 19:13:37.157494] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614907 ] 00:06:50.278 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.278 [2024-04-24 19:13:37.232905] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.536 [2024-04-24 19:13:37.317624] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val=0x1 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val=fill 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@23 -- # accel_opc=fill 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.536 19:13:37 -- accel/accel.sh@20 -- # val=0x80 00:06:50.536 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.536 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 
-- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val=software 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@22 -- # accel_module=software 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val=64 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val=64 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val=1 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val=Yes 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:50.537 19:13:37 -- accel/accel.sh@20 -- # val= 00:06:50.537 19:13:37 -- accel/accel.sh@21 -- # case "$var" in 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # IFS=: 00:06:50.537 19:13:37 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:51.919 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:51.919 19:13:38 -- accel/accel.sh@19 
-- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:51.919 19:13:38 -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:51.919 19:13:38 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.919 00:06:51.919 real 0m1.381s 00:06:51.919 user 0m1.252s 00:06:51.919 sys 0m0.142s 00:06:51.919 19:13:38 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:51.919 19:13:38 -- common/autotest_common.sh@10 -- # set +x 00:06:51.919 ************************************ 00:06:51.919 END TEST accel_fill 00:06:51.919 ************************************ 00:06:51.919 19:13:38 -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:51.919 19:13:38 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:51.919 19:13:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:51.919 19:13:38 -- common/autotest_common.sh@10 -- # set +x 00:06:51.919 ************************************ 00:06:51.919 START TEST accel_copy_crc32c 00:06:51.919 ************************************ 00:06:51.919 19:13:38 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y 00:06:51.919 19:13:38 -- accel/accel.sh@16 -- # local accel_opc 00:06:51.919 19:13:38 -- accel/accel.sh@17 -- # local accel_module 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:51.919 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:51.919 19:13:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:51.919 19:13:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.919 19:13:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:51.919 19:13:38 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:51.919 19:13:38 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:51.919 19:13:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.919 19:13:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.919 19:13:38 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:51.919 19:13:38 -- accel/accel.sh@40 -- # local IFS=, 00:06:51.920 19:13:38 -- accel/accel.sh@41 -- # jq -r . 00:06:51.920 [2024-04-24 19:13:38.732453] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:06:51.920 [2024-04-24 19:13:38.732534] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615108 ] 00:06:51.920 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.920 [2024-04-24 19:13:38.806849] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.920 [2024-04-24 19:13:38.891238] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=0x1 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=0 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=software 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@22 -- # accel_module=software 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=32 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 
00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=32 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=1 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val=Yes 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:52.179 19:13:38 -- accel/accel.sh@20 -- # val= 00:06:52.179 19:13:38 -- accel/accel.sh@21 -- # case "$var" in 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # IFS=: 00:06:52.179 19:13:38 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.115 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.115 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.115 19:13:40 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:53.115 19:13:40 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:53.115 19:13:40 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.115 00:06:53.115 real 0m1.383s 00:06:53.115 user 0m1.248s 00:06:53.115 sys 0m0.148s 00:06:53.115 19:13:40 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:53.115 19:13:40 -- common/autotest_common.sh@10 -- # set +x 00:06:53.115 ************************************ 00:06:53.115 END TEST accel_copy_crc32c 00:06:53.115 ************************************ 00:06:53.373 19:13:40 -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:53.373 
19:13:40 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:53.373 19:13:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:53.373 19:13:40 -- common/autotest_common.sh@10 -- # set +x 00:06:53.373 ************************************ 00:06:53.373 START TEST accel_copy_crc32c_C2 00:06:53.373 ************************************ 00:06:53.373 19:13:40 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:53.373 19:13:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:53.373 19:13:40 -- accel/accel.sh@17 -- # local accel_module 00:06:53.373 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.373 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.373 19:13:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:53.373 19:13:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:53.373 19:13:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.373 19:13:40 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.373 19:13:40 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.373 19:13:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.373 19:13:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.373 19:13:40 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.373 19:13:40 -- accel/accel.sh@40 -- # local IFS=, 00:06:53.373 19:13:40 -- accel/accel.sh@41 -- # jq -r . 00:06:53.373 [2024-04-24 19:13:40.307206] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:53.373 [2024-04-24 19:13:40.307287] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615366 ] 00:06:53.373 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.373 [2024-04-24 19:13:40.383385] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.632 [2024-04-24 19:13:40.469229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=0x1 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 
00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=0 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=software 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@22 -- # accel_module=software 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=32 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=32 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=1 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val=Yes 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:53.632 19:13:40 -- accel/accel.sh@20 -- # val= 00:06:53.632 19:13:40 -- accel/accel.sh@21 -- # case "$var" in 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # IFS=: 00:06:53.632 19:13:40 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- 
accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@20 -- # val= 00:06:55.007 19:13:41 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:55.007 19:13:41 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:55.007 19:13:41 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.007 00:06:55.007 real 0m1.385s 00:06:55.007 user 0m1.259s 00:06:55.007 sys 0m0.139s 00:06:55.007 19:13:41 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:55.007 19:13:41 -- common/autotest_common.sh@10 -- # set +x 00:06:55.007 ************************************ 00:06:55.007 END TEST accel_copy_crc32c_C2 00:06:55.007 ************************************ 00:06:55.007 19:13:41 -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:55.007 19:13:41 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:55.007 19:13:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.007 19:13:41 -- common/autotest_common.sh@10 -- # set +x 00:06:55.007 ************************************ 00:06:55.007 START TEST accel_dualcast 00:06:55.007 ************************************ 00:06:55.007 19:13:41 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dualcast -y 00:06:55.007 19:13:41 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.007 19:13:41 -- accel/accel.sh@17 -- # local accel_module 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # IFS=: 00:06:55.007 19:13:41 -- accel/accel.sh@19 -- # read -r var val 00:06:55.007 19:13:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:55.007 19:13:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:55.007 19:13:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.007 19:13:41 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.007 19:13:41 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.007 19:13:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.007 19:13:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.007 19:13:41 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:55.007 19:13:41 -- accel/accel.sh@40 -- # local IFS=, 00:06:55.007 19:13:41 -- accel/accel.sh@41 -- # jq -r . 00:06:55.007 [2024-04-24 19:13:41.881126] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
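The copy_crc32c runs above and the dualcast run starting here all share one invocation shape: accel_perf is handed a JSON config on /dev/fd/62 and told to run a single workload for one second with result verification on. A minimal sketch of that shape, assuming an empty JSON config and treating the extra per-workload flags (-C 2 on the chained crc32c case, -x 3 on the later xor case) as opaque pass-throughs rather than asserting their semantics:

    ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    run_workload() {
        local workload=$1; shift
        # -t 1: run for one second; -y: verify results; any remaining
        # arguments (e.g. -C 2 or -x 3) pass straight through to accel_perf.
        # The process substitution supplies the config on a /dev/fd path,
        # matching the "-c /dev/fd/62" seen throughout this log.
        "$ACCEL_PERF" -c <(echo '{}') -t 1 -w "$workload" -y "$@"
    }
    run_workload copy_crc32c -C 2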
00:06:55.007 [2024-04-24 19:13:41.881225] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615681 ] 00:06:55.007 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.007 [2024-04-24 19:13:41.955467] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.265 [2024-04-24 19:13:42.036731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=0x1 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=dualcast 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=software 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@22 -- # accel_module=software 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=32 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=32 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 -- accel/accel.sh@20 -- # val=1 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.265 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.265 19:13:42 
-- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.265 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.266 19:13:42 -- accel/accel.sh@20 -- # val=Yes 00:06:55.266 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.266 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.266 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:55.266 19:13:42 -- accel/accel.sh@20 -- # val= 00:06:55.266 19:13:42 -- accel/accel.sh@21 -- # case "$var" in 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # IFS=: 00:06:55.266 19:13:42 -- accel/accel.sh@19 -- # read -r var val 00:06:56.197 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.198 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.198 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.198 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.456 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.456 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.456 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.456 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.456 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.456 19:13:43 -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:56.456 19:13:43 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.456 00:06:56.456 real 0m1.360s 00:06:56.456 user 0m1.235s 00:06:56.456 sys 0m0.138s 00:06:56.456 19:13:43 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:56.456 19:13:43 -- common/autotest_common.sh@10 -- # set +x 00:06:56.456 ************************************ 00:06:56.456 END TEST accel_dualcast 00:06:56.456 ************************************ 00:06:56.456 19:13:43 -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:56.456 19:13:43 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:56.456 19:13:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:56.456 19:13:43 -- common/autotest_common.sh@10 -- # set +x 00:06:56.456 ************************************ 00:06:56.456 START TEST accel_compare 00:06:56.456 ************************************ 00:06:56.456 19:13:43 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compare -y 00:06:56.456 19:13:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:56.456 19:13:43 
-- accel/accel.sh@17 -- # local accel_module 00:06:56.456 19:13:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.456 19:13:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:56.456 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.456 19:13:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.456 19:13:43 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.456 19:13:43 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.456 19:13:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.456 19:13:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.456 19:13:43 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:56.456 19:13:43 -- accel/accel.sh@40 -- # local IFS=, 00:06:56.456 19:13:43 -- accel/accel.sh@41 -- # jq -r . 00:06:56.456 [2024-04-24 19:13:43.397807] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:56.456 [2024-04-24 19:13:43.397849] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615882 ] 00:06:56.456 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.456 [2024-04-24 19:13:43.467831] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.716 [2024-04-24 19:13:43.550731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=0x1 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=compare 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@23 -- # accel_opc=compare 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- 
accel/accel.sh@20 -- # val=software 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@22 -- # accel_module=software 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=32 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=32 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=1 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val=Yes 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:56.716 19:13:43 -- accel/accel.sh@20 -- # val= 00:06:56.716 19:13:43 -- accel/accel.sh@21 -- # case "$var" in 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # IFS=: 00:06:56.716 19:13:43 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@20 -- # val= 00:06:58.096 19:13:44 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.096 19:13:44 -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:58.096 19:13:44 -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:06:58.096 00:06:58.096 real 0m1.361s 00:06:58.096 user 0m1.243s 00:06:58.096 sys 0m0.130s 00:06:58.096 19:13:44 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:58.096 19:13:44 -- common/autotest_common.sh@10 -- # set +x 00:06:58.096 ************************************ 00:06:58.096 END TEST accel_compare 00:06:58.096 ************************************ 00:06:58.096 19:13:44 -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:58.096 19:13:44 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:58.096 19:13:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:58.096 19:13:44 -- common/autotest_common.sh@10 -- # set +x 00:06:58.096 ************************************ 00:06:58.096 START TEST accel_xor 00:06:58.096 ************************************ 00:06:58.096 19:13:44 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y 00:06:58.096 19:13:44 -- accel/accel.sh@16 -- # local accel_opc 00:06:58.096 19:13:44 -- accel/accel.sh@17 -- # local accel_module 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # IFS=: 00:06:58.096 19:13:44 -- accel/accel.sh@19 -- # read -r var val 00:06:58.096 19:13:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:58.096 19:13:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:58.096 19:13:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.096 19:13:44 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.096 19:13:44 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.096 19:13:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.096 19:13:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.096 19:13:44 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.096 19:13:44 -- accel/accel.sh@40 -- # local IFS=, 00:06:58.096 19:13:44 -- accel/accel.sh@41 -- # jq -r . 00:06:58.096 [2024-04-24 19:13:44.965494] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
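Nearly every line of this trace is the same three-step cycle from accel.sh (the @19-@23 markers): set IFS to a colon, read one "key: value" line of the perf tool's output, and case-match the key. A self-contained sketch of that loop, with the key strings chosen here as illustrative assumptions rather than the tool's exact output; the trace itself grounds the accel_opc and accel_module assignments:

    while IFS=: read -r var val; do
        val=${val# }                           # drop the space after the colon
        case "$var" in
            'Workload Type') accel_opc=$val ;;     # trace: accel_opc=xor
            'Module')        accel_module=$val ;;  # trace: accel_module=software
            *) ;;                              # keys the test does not check
        esac
    done < <(printf 'Workload Type: xor\nModule: software\n')
    echo "ran $accel_opc on the $accel_module module"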
00:06:58.096 [2024-04-24 19:13:44.965575] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616085 ] 00:06:58.096 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.096 [2024-04-24 19:13:45.042936] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.356 [2024-04-24 19:13:45.129811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val=0x1 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val=xor 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@23 -- # accel_opc=xor 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val=2 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.356 19:13:45 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.356 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.356 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val=software 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@22 -- # accel_module=software 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val=32 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val=32 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- 
accel/accel.sh@20 -- # val=1 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val=Yes 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:58.357 19:13:45 -- accel/accel.sh@20 -- # val= 00:06:58.357 19:13:45 -- accel/accel.sh@21 -- # case "$var" in 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # IFS=: 00:06:58.357 19:13:45 -- accel/accel.sh@19 -- # read -r var val 00:06:59.735 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.735 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.735 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.735 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.735 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.735 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.735 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.735 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.735 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.736 19:13:46 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:59.736 19:13:46 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.736 00:06:59.736 real 0m1.387s 00:06:59.736 user 0m1.252s 00:06:59.736 sys 0m0.148s 00:06:59.736 19:13:46 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:06:59.736 19:13:46 -- common/autotest_common.sh@10 -- # set +x 00:06:59.736 ************************************ 00:06:59.736 END TEST accel_xor 00:06:59.736 ************************************ 00:06:59.736 19:13:46 -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:59.736 19:13:46 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:59.736 19:13:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.736 19:13:46 -- common/autotest_common.sh@10 -- # set +x 00:06:59.736 ************************************ 00:06:59.736 START TEST accel_xor 
00:06:59.736 ************************************ 00:06:59.736 19:13:46 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w xor -y -x 3 00:06:59.736 19:13:46 -- accel/accel.sh@16 -- # local accel_opc 00:06:59.736 19:13:46 -- accel/accel.sh@17 -- # local accel_module 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:59.736 19:13:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:59.736 19:13:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.736 19:13:46 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.736 19:13:46 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.736 19:13:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.736 19:13:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.736 19:13:46 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.736 19:13:46 -- accel/accel.sh@40 -- # local IFS=, 00:06:59.736 19:13:46 -- accel/accel.sh@41 -- # jq -r . 00:06:59.736 [2024-04-24 19:13:46.541466] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:06:59.736 [2024-04-24 19:13:46.541545] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616288 ] 00:06:59.736 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.736 [2024-04-24 19:13:46.617960] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.736 [2024-04-24 19:13:46.701461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val=0x1 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.736 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.736 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.736 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.995 19:13:46 -- accel/accel.sh@20 -- # val=xor 00:06:59.995 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@23 -- # accel_opc=xor 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=3 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=software 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@22 -- # accel_module=software 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=32 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=32 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=1 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val=Yes 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:06:59.996 19:13:46 -- accel/accel.sh@20 -- # val= 00:06:59.996 19:13:46 -- accel/accel.sh@21 -- # case "$var" in 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # IFS=: 00:06:59.996 19:13:46 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # 
read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@20 -- # val= 00:07:00.926 19:13:47 -- accel/accel.sh@21 -- # case "$var" in 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # IFS=: 00:07:00.926 19:13:47 -- accel/accel.sh@19 -- # read -r var val 00:07:00.926 19:13:47 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:00.926 19:13:47 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:00.926 19:13:47 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.926 00:07:00.926 real 0m1.379s 00:07:00.926 user 0m1.244s 00:07:00.926 sys 0m0.147s 00:07:00.926 19:13:47 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:00.926 19:13:47 -- common/autotest_common.sh@10 -- # set +x 00:07:00.926 ************************************ 00:07:00.926 END TEST accel_xor 00:07:00.926 ************************************ 00:07:00.926 19:13:47 -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:00.926 19:13:47 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:00.926 19:13:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:00.926 19:13:47 -- common/autotest_common.sh@10 -- # set +x 00:07:01.183 ************************************ 00:07:01.183 START TEST accel_dif_verify 00:07:01.183 ************************************ 00:07:01.183 19:13:48 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_verify 00:07:01.183 19:13:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:01.183 19:13:48 -- accel/accel.sh@17 -- # local accel_module 00:07:01.183 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.183 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.183 19:13:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:01.183 19:13:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:01.183 19:13:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.183 19:13:48 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.183 19:13:48 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.183 19:13:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.183 19:13:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.183 19:13:48 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.183 19:13:48 -- accel/accel.sh@40 -- # local IFS=, 00:07:01.183 19:13:48 -- accel/accel.sh@41 -- # jq -r . 00:07:01.183 [2024-04-24 19:13:48.105481] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
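The post-run assertions rendered as "[[ software == \s\o\f\t\w\a\r\e ]]" throughout this log are not corruption: when the right-hand side of == inside [[ ]] is quoted, bash's xtrace escapes every character to show that the comparison is literal rather than a glob pattern. A minimal reproduction:

    set -x
    accel_module=software
    [[ "$accel_module" == "software" ]] && echo "module check passed"
    set +x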
00:07:01.183 [2024-04-24 19:13:48.105560] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616495 ] 00:07:01.183 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.183 [2024-04-24 19:13:48.180762] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.441 [2024-04-24 19:13:48.264926] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=0x1 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=dif_verify 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=software 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@22 -- # accel_module=software 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r 
var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=32 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=32 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=1 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val=No 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:01.441 19:13:48 -- accel/accel.sh@20 -- # val= 00:07:01.441 19:13:48 -- accel/accel.sh@21 -- # case "$var" in 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # IFS=: 00:07:01.441 19:13:48 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:02.810 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.810 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.810 19:13:49 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.810 19:13:49 -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:02.810 19:13:49 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.810 00:07:02.810 real 0m1.366s 00:07:02.810 user 0m1.231s 00:07:02.811 sys 0m0.148s 00:07:02.811 19:13:49 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:02.811 19:13:49 -- common/autotest_common.sh@10 -- # set +x 00:07:02.811 
************************************ 00:07:02.811 END TEST accel_dif_verify 00:07:02.811 ************************************ 00:07:02.811 19:13:49 -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:02.811 19:13:49 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:02.811 19:13:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.811 19:13:49 -- common/autotest_common.sh@10 -- # set +x 00:07:02.811 ************************************ 00:07:02.811 START TEST accel_dif_generate 00:07:02.811 ************************************ 00:07:02.811 19:13:49 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate 00:07:02.811 19:13:49 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.811 19:13:49 -- accel/accel.sh@17 -- # local accel_module 00:07:02.811 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:02.811 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:02.811 19:13:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:02.811 19:13:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:02.811 19:13:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.811 19:13:49 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.811 19:13:49 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.811 19:13:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.811 19:13:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.811 19:13:49 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.811 19:13:49 -- accel/accel.sh@40 -- # local IFS=, 00:07:02.811 19:13:49 -- accel/accel.sh@41 -- # jq -r . 00:07:02.811 [2024-04-24 19:13:49.653578] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
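Each test in this section is driven by the same run_test wrapper from autotest_common.sh: the '[' N -le 1 ']' lines are its argument-count guard, the starred banners are its START/END markers, and the real/user/sys triples come from timing the wrapped command. A hedged reconstruction of that shape, not SPDK's actual helper:

    run_test() {
        local name=$1; shift
        # guard seen in the trace as e.g. '[' 6 -le 1 ']' (command + args)
        [ "$#" -le 1 ] && echo "run_test expects a command plus arguments" >&2
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"                # emits the real/user/sys lines seen above
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }
    run_test demo_sleep sleep 1  # demo_sleep is a made-up test name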
00:07:02.811 [2024-04-24 19:13:49.653672] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616758 ] 00:07:02.811 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.811 [2024-04-24 19:13:49.732148] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.811 [2024-04-24 19:13:49.814938] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=0x1 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=dif_generate 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=software 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@22 -- # accel_module=software 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read 
-r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=32 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=32 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val=1 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.068 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.068 19:13:49 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:03.068 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.069 19:13:49 -- accel/accel.sh@20 -- # val=No 00:07:03.069 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.069 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.069 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.069 19:13:49 -- accel/accel.sh@20 -- # val= 00:07:03.069 19:13:49 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # IFS=: 00:07:03.069 19:13:49 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:03.999 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:03.999 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:03.999 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:03.999 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:03.999 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:03.999 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:03.999 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.256 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.256 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.256 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.256 19:13:51 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.256 19:13:51 -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:04.256 19:13:51 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.256 00:07:04.256 real 0m1.384s 00:07:04.256 user 0m1.246s 00:07:04.256 sys 0m0.152s 00:07:04.256 19:13:51 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:04.256 19:13:51 -- common/autotest_common.sh@10 -- # set +x 00:07:04.256 
************************************ 00:07:04.256 END TEST accel_dif_generate 00:07:04.256 ************************************ 00:07:04.256 19:13:51 -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:04.256 19:13:51 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:04.256 19:13:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.256 19:13:51 -- common/autotest_common.sh@10 -- # set +x 00:07:04.256 ************************************ 00:07:04.256 START TEST accel_dif_generate_copy 00:07:04.256 ************************************ 00:07:04.256 19:13:51 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w dif_generate_copy 00:07:04.256 19:13:51 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.256 19:13:51 -- accel/accel.sh@17 -- # local accel_module 00:07:04.256 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.256 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.256 19:13:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:04.256 19:13:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:04.256 19:13:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.256 19:13:51 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.256 19:13:51 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.256 19:13:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.256 19:13:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.256 19:13:51 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.256 19:13:51 -- accel/accel.sh@40 -- # local IFS=, 00:07:04.256 19:13:51 -- accel/accel.sh@41 -- # jq -r . 00:07:04.256 [2024-04-24 19:13:51.205916] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
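The build_accel_config block repeated before every accel_perf launch (accel.sh@12-@41) assembles the JSON handed over on /dev/fd/62: an array of config snippets grows whenever one of the [[ 0 -gt 0 ]] guards is true (all false in this run, so the array stays empty), the entries are joined with a comma IFS, and the result is validated through jq -r. A simplified stand-in; the guard variable and RPC method names below are assumptions, not SPDK's:

    build_accel_config() {
        local accel_json_cfg=()
        # one guard per optional module; in this log every guard was 0 -gt 0
        [[ ${ACCEL_ENABLE_DSA:-0} -gt 0 ]] \
            && accel_json_cfg+=('{"method": "enable_dsa"}')
        local IFS=,
        printf '{"subsystems": [{"subsystem": "accel", "config": [%s]}]}' \
            "${accel_json_cfg[*]}" | jq -r .
    }
    build_accel_config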
00:07:04.256 [2024-04-24 19:13:51.205996] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617056 ] 00:07:04.256 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.514 [2024-04-24 19:13:51.281956] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.514 [2024-04-24 19:13:51.365463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=0x1 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=software 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@22 -- # accel_module=software 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=32 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=32 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r 
var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=1 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val=No 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:04.514 19:13:51 -- accel/accel.sh@20 -- # val= 00:07:04.514 19:13:51 -- accel/accel.sh@21 -- # case "$var" in 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # IFS=: 00:07:04.514 19:13:51 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:05.886 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:05.886 19:13:52 -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:05.886 19:13:52 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.886 00:07:05.886 real 0m1.382s 00:07:05.886 user 0m1.241s 00:07:05.886 sys 0m0.152s 00:07:05.886 19:13:52 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:05.886 19:13:52 -- common/autotest_common.sh@10 -- # set +x 00:07:05.886 ************************************ 00:07:05.886 END TEST accel_dif_generate_copy 00:07:05.886 ************************************ 00:07:05.886 19:13:52 -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:05.886 19:13:52 -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.886 19:13:52 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:05.886 19:13:52 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:05.886 19:13:52 -- common/autotest_common.sh@10 -- # set +x 00:07:05.886 ************************************ 00:07:05.886 START TEST accel_comp 00:07:05.886 ************************************ 00:07:05.886 19:13:52 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.886 19:13:52 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.886 19:13:52 -- accel/accel.sh@17 -- # local accel_module 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:05.886 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:05.886 19:13:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.886 19:13:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:05.886 19:13:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.886 19:13:52 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.886 19:13:52 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:05.886 19:13:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.886 19:13:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.886 19:13:52 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:05.886 19:13:52 -- accel/accel.sh@40 -- # local IFS=, 00:07:05.886 19:13:52 -- accel/accel.sh@41 -- # jq -r . 00:07:05.886 [2024-04-24 19:13:52.752564] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:05.886 [2024-04-24 19:13:52.752642] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617262 ] 00:07:05.886 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.886 [2024-04-24 19:13:52.827018] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.144 [2024-04-24 19:13:52.911016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=0x1 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 
19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=compress 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@23 -- # accel_opc=compress 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=software 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@22 -- # accel_module=software 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=32 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=32 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=1 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val=No 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:06.144 19:13:52 -- accel/accel.sh@20 -- # val= 00:07:06.144 19:13:52 -- accel/accel.sh@21 -- # case "$var" in 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # IFS=: 00:07:06.144 19:13:52 -- accel/accel.sh@19 -- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 
-- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.513 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.513 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.513 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:07.514 19:13:54 -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:07.514 19:13:54 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.514 00:07:07.514 real 0m1.377s 00:07:07.514 user 0m1.244s 00:07:07.514 sys 0m0.146s 00:07:07.514 19:13:54 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:07.514 19:13:54 -- common/autotest_common.sh@10 -- # set +x 00:07:07.514 ************************************ 00:07:07.514 END TEST accel_comp 00:07:07.514 ************************************ 00:07:07.514 19:13:54 -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:07.514 19:13:54 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:07.514 19:13:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.514 19:13:54 -- common/autotest_common.sh@10 -- # set +x 00:07:07.514 ************************************ 00:07:07.514 START TEST accel_decomp 00:07:07.514 ************************************ 00:07:07.514 19:13:54 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:07.514 19:13:54 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.514 19:13:54 -- accel/accel.sh@17 -- # local accel_module 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:07.514 19:13:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.514 19:13:54 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.514 19:13:54 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.514 19:13:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.514 19:13:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.514 19:13:54 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:07.514 19:13:54 -- accel/accel.sh@40 -- # local IFS=, 00:07:07.514 19:13:54 -- accel/accel.sh@41 -- # jq -r . 00:07:07.514 [2024-04-24 19:13:54.308770] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:07:07.514 [2024-04-24 19:13:54.308857] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617470 ] 00:07:07.514 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.514 [2024-04-24 19:13:54.382662] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.514 [2024-04-24 19:13:54.465148] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=0x1 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=decompress 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=software 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@22 -- # accel_module=software 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=32 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 
19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=32 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=1 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val=Yes 00:07:07.514 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.514 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.514 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.772 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.772 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.772 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:07.772 19:13:54 -- accel/accel.sh@20 -- # val= 00:07:07.772 19:13:54 -- accel/accel.sh@21 -- # case "$var" in 00:07:07.772 19:13:54 -- accel/accel.sh@19 -- # IFS=: 00:07:07.772 19:13:54 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@20 -- # val= 00:07:08.702 19:13:55 -- accel/accel.sh@21 -- # case "$var" in 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # IFS=: 00:07:08.702 19:13:55 -- accel/accel.sh@19 -- # read -r var val 00:07:08.702 19:13:55 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:08.702 19:13:55 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:08.702 19:13:55 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.702 00:07:08.702 real 0m1.367s 00:07:08.702 user 0m1.233s 00:07:08.702 sys 0m0.147s 00:07:08.702 19:13:55 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:08.702 19:13:55 -- common/autotest_common.sh@10 -- # set +x 00:07:08.702 ************************************ 00:07:08.702 END TEST accel_decomp 00:07:08.702 ************************************ 00:07:08.702 19:13:55 -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:08.702 19:13:55 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:07:08.702 19:13:55 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:08.702 19:13:55 -- common/autotest_common.sh@10 -- # set +x
00:07:08.959 ************************************
00:07:08.959 START TEST accel_decmop_full
00:07:08.959 ************************************
00:07:08.959 19:13:55 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:08.959 19:13:55 -- accel/accel.sh@16 -- # local accel_opc
00:07:08.959 19:13:55 -- accel/accel.sh@17 -- # local accel_module
00:07:08.959 19:13:55 -- accel/accel.sh@19 -- # IFS=:
00:07:08.959 19:13:55 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:08.959 19:13:55 -- accel/accel.sh@19 -- # read -r var val
00:07:08.959 19:13:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0
00:07:08.959 19:13:55 -- accel/accel.sh@12 -- # build_accel_config
00:07:08.959 19:13:55 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:08.959 19:13:55 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:08.959 19:13:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:08.959 19:13:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:08.959 19:13:55 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:08.959 19:13:55 -- accel/accel.sh@40 -- # local IFS=,
00:07:08.959 19:13:55 -- accel/accel.sh@41 -- # jq -r .
00:07:08.959 [2024-04-24 19:13:55.857509] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
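
Note: the "full" variant starting here differs from the plain accel_decomp run above only in the trailing "-o 0". Where the earlier runs traced val='4096 bytes', this one traces val='111250 bytes', so "-o 0" appears to make the harness size the buffer to the whole bib test file instead of 4 KiB blocks. That reading is an inference from the trace, not something the log states; a hedged sketch of how such a default might be computed:

  # Assumption: block size 0 means "use the whole input file as one buffer".
  bib=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib  # path from the command line above
  block_size=0                        # as passed via "-o 0"
  if [[ $block_size -eq 0 ]]; then
      block_size=$(stat -c %s "$bib") # whole-file size; the trace reports 111250
  fi
  echo "$block_size bytes"
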
00:07:08.959 [2024-04-24 19:13:55.857575] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617672 ] 00:07:08.959 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.959 [2024-04-24 19:13:55.934191] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.216 [2024-04-24 19:13:56.017486] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.216 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.216 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.216 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.216 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.216 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.216 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.216 19:13:56 -- accel/accel.sh@20 -- # val=0x1 00:07:09.216 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.216 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.216 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.216 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=decompress 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=software 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@22 -- # accel_module=software 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=32 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 
19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=32 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=1 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val=Yes 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:09.217 19:13:56 -- accel/accel.sh@20 -- # val= 00:07:09.217 19:13:56 -- accel/accel.sh@21 -- # case "$var" in 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # IFS=: 00:07:09.217 19:13:56 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.589 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.589 19:13:57 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:10.589 19:13:57 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:10.589 19:13:57 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.589 00:07:10.589 real 0m1.387s 00:07:10.589 user 0m1.256s 00:07:10.589 sys 0m0.143s 00:07:10.589 19:13:57 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:10.589 19:13:57 -- common/autotest_common.sh@10 -- # set +x 00:07:10.589 ************************************ 00:07:10.589 END TEST accel_decmop_full 00:07:10.589 ************************************ 00:07:10.589 19:13:57 -- accel/accel.sh@119 -- # run_test accel_decomp_mcore 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:10.589 19:13:57 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:07:10.589 19:13:57 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:10.589 19:13:57 -- common/autotest_common.sh@10 -- # set +x
00:07:10.589 ************************************
00:07:10.589 START TEST accel_decomp_mcore
00:07:10.589 ************************************
00:07:10.589 19:13:57 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:10.589 19:13:57 -- accel/accel.sh@16 -- # local accel_opc
00:07:10.589 19:13:57 -- accel/accel.sh@17 -- # local accel_module
00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # IFS=:
00:07:10.589 19:13:57 -- accel/accel.sh@19 -- # read -r var val
00:07:10.589 19:13:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:10.589 19:13:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:07:10.589 19:13:57 -- accel/accel.sh@12 -- # build_accel_config
00:07:10.589 19:13:57 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:10.589 19:13:57 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:10.589 19:13:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:10.589 19:13:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:10.589 19:13:57 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:10.589 19:13:57 -- accel/accel.sh@40 -- # local IFS=,
00:07:10.589 19:13:57 -- accel/accel.sh@41 -- # jq -r .
00:07:10.589 [2024-04-24 19:13:57.432293] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
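
Note: this run passes "-m 0xf", and the EAL parameters line that follows accordingly shows "-c 0xf" with four "Reactor started on core N" notices instead of one: the mask selects cores 0-3. Counting the cores in such a mask is plain bit arithmetic; a self-contained check:

  # Count the cores selected by an EAL-style core mask such as 0xf.
  mask=$((0xf))
  count=0
  for ((i = 0; i < 64; i++)); do
      (( (mask >> i) & 1 )) && count=$((count + 1))
  done
  echo "cores in mask: $count"   # prints 4, matching the four reactor notices
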
00:07:10.589 [2024-04-24 19:13:57.432371] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617878 ] 00:07:10.589 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.589 [2024-04-24 19:13:57.510146] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:10.589 [2024-04-24 19:13:57.600711] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:10.589 [2024-04-24 19:13:57.600798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:10.589 [2024-04-24 19:13:57.600817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:10.590 [2024-04-24 19:13:57.600819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=0xf 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=decompress 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=software 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@22 -- # accel_module=software 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=32 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=32 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=1 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val=Yes 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:10.848 19:13:57 -- accel/accel.sh@20 -- # val= 00:07:10.848 19:13:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # IFS=: 00:07:10.848 19:13:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 
19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:12.223 19:13:58 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:12.223 19:13:58 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.223 00:07:12.223 real 0m1.406s 00:07:12.223 user 0m4.627s 00:07:12.223 sys 0m0.169s 00:07:12.223 19:13:58 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:12.223 19:13:58 -- common/autotest_common.sh@10 -- # set +x 00:07:12.223 ************************************ 00:07:12.223 END TEST accel_decomp_mcore 00:07:12.223 ************************************ 00:07:12.223 19:13:58 -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.223 19:13:58 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:12.223 19:13:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.223 19:13:58 -- common/autotest_common.sh@10 -- # set +x 00:07:12.223 ************************************ 00:07:12.223 START TEST accel_decomp_full_mcore 00:07:12.223 ************************************ 00:07:12.223 19:13:58 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.223 19:13:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.223 19:13:58 -- accel/accel.sh@17 -- # local accel_module 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:58 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.223 19:13:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.223 19:13:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.223 19:13:58 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.223 19:13:58 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.223 19:13:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.223 19:13:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.223 19:13:58 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.223 19:13:58 -- accel/accel.sh@40 -- # local IFS=, 00:07:12.223 19:13:58 -- accel/accel.sh@41 -- # jq -r . 00:07:12.223 [2024-04-24 19:13:59.011197] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:07:12.223 [2024-04-24 19:13:59.011281] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618099 ] 00:07:12.223 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.223 [2024-04-24 19:13:59.089005] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.223 [2024-04-24 19:13:59.173614] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.223 [2024-04-24 19:13:59.173712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.223 [2024-04-24 19:13:59.173793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:12.223 [2024-04-24 19:13:59.173794] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val=0xf 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val=decompress 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.223 19:13:59 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.223 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.223 19:13:59 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:12.223 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=software 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@22 -- # accel_module=software 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=32 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=32 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=1 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val=Yes 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:12.224 19:13:59 -- accel/accel.sh@20 -- # val= 00:07:12.224 19:13:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # IFS=: 00:07:12.224 19:13:59 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.598 
19:14:00 -- accel/accel.sh@19 -- # IFS=:
00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val
00:07:13.598 19:14:00 -- accel/accel.sh@20 -- # val=
00:07:13.598 19:14:00 -- accel/accel.sh@21 -- # case "$var" in
00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=:
00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val
00:07:13.598 19:14:00 -- accel/accel.sh@27 -- # [[ -n software ]]
00:07:13.598 19:14:00 -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:07:13.598 19:14:00 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:07:13.598
00:07:13.598 real 0m1.400s
00:07:13.598 user 0m4.638s
00:07:13.598 sys 0m0.151s
00:07:13.598 19:14:00 -- common/autotest_common.sh@1112 -- # xtrace_disable
00:07:13.598 19:14:00 -- common/autotest_common.sh@10 -- # set +x
00:07:13.598 ************************************
00:07:13.598 END TEST accel_decomp_full_mcore
00:07:13.598 ************************************
00:07:13.598 19:14:00 -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:13.598 19:14:00 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']'
00:07:13.598 19:14:00 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:13.598 19:14:00 -- common/autotest_common.sh@10 -- # set +x
00:07:13.598 ************************************
00:07:13.598 START TEST accel_decomp_mthread
00:07:13.598 ************************************
00:07:13.598 19:14:00 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:13.598 19:14:00 -- accel/accel.sh@16 -- # local accel_opc
00:07:13.598 19:14:00 -- accel/accel.sh@17 -- # local accel_module
00:07:13.598 19:14:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # IFS=:
00:07:13.598 19:14:00 -- accel/accel.sh@19 -- # read -r var val
00:07:13.598 19:14:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:07:13.598 19:14:00 -- accel/accel.sh@12 -- # build_accel_config
00:07:13.598 19:14:00 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:07:13.598 19:14:00 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:07:13.598 19:14:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:13.598 19:14:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:13.598 19:14:00 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:07:13.598 19:14:00 -- accel/accel.sh@40 -- # local IFS=,
00:07:13.598 19:14:00 -- accel/accel.sh@41 -- # jq -r .
00:07:13.598 [2024-04-24 19:14:00.555542] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
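
Note: two details at this boundary. First, the accel_decomp_full_mcore summary above reports real 0m1.400s but user 0m4.638s: four busy-polling reactors each spin for roughly the 1-second "-t 1" run, so CPU time is about four times wall time. Second, the accel_decomp_mthread test now starting swaps the core mask for "-T 2", which shows up in the config trace below as val=2 and appears to request two worker threads on the single core (an inference from the trace, not stated in the log). A trivial sanity check of the mcore arithmetic:

  # Four polling reactors at ~1.15 s of CPU each explain the 0m4.638s user
  # time against 0m1.400s wall time; the numbers are read off the log above.
  reactors=4
  per_core_ms=1150
  echo "expected user time: $((reactors * per_core_ms)) ms"   # ~4600 ms
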
00:07:13.598 [2024-04-24 19:14:00.555608] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618413 ] 00:07:13.598 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.857 [2024-04-24 19:14:00.630622] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.857 [2024-04-24 19:14:00.713523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=0x1 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=decompress 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=software 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@22 -- # accel_module=software 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=32 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 
19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=32 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=2 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val=Yes 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:13.857 19:14:00 -- accel/accel.sh@20 -- # val= 00:07:13.857 19:14:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # IFS=: 00:07:13.857 19:14:00 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@20 -- # val= 00:07:15.230 19:14:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:01 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:01 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:15.230 19:14:01 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:15.230 19:14:01 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.230 00:07:15.230 real 0m1.379s 00:07:15.230 user 0m1.247s 00:07:15.230 sys 0m0.145s 00:07:15.230 19:14:01 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:15.230 19:14:01 -- common/autotest_common.sh@10 -- # 
set +x 00:07:15.230 ************************************ 00:07:15.230 END TEST accel_decomp_mthread 00:07:15.230 ************************************ 00:07:15.230 19:14:01 -- accel/accel.sh@122 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.230 19:14:01 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:15.230 19:14:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.230 19:14:01 -- common/autotest_common.sh@10 -- # set +x 00:07:15.230 ************************************ 00:07:15.230 START TEST accel_deomp_full_mthread 00:07:15.230 ************************************ 00:07:15.230 19:14:02 -- common/autotest_common.sh@1111 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.230 19:14:02 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.230 19:14:02 -- accel/accel.sh@17 -- # local accel_module 00:07:15.230 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.230 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.230 19:14:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.230 19:14:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:15.230 19:14:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.230 19:14:02 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.230 19:14:02 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.231 19:14:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.231 19:14:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.231 19:14:02 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.231 19:14:02 -- accel/accel.sh@40 -- # local IFS=, 00:07:15.231 19:14:02 -- accel/accel.sh@41 -- # jq -r . 00:07:15.231 [2024-04-24 19:14:02.103013] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
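The accel_deomp_full_mthread case above ('deomp' is the test script's own spelling) uses the same invocation plus one extra flag, -o 0. Judging by the logged buffer size ('111250 bytes' here versus '4096 bytes' in the plain mthread run), the 'full' variants appear to process the whole compressed bib block rather than 4 KiB chunks -- an inference from this log, not a documented guarantee:

  # Same sketch as above, with the one flag the 'full' variant adds:
  ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -T 2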
00:07:15.231 [2024-04-24 19:14:02.103115] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618650 ] 00:07:15.231 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.231 [2024-04-24 19:14:02.179493] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.489 [2024-04-24 19:14:02.263174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=0x1 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=decompress 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=software 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@22 -- # accel_module=software 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=32 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 
19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=32 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=2 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val=Yes 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:15.489 19:14:02 -- accel/accel.sh@20 -- # val= 00:07:15.489 19:14:02 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # IFS=: 00:07:15.489 19:14:02 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@20 -- # val= 00:07:16.862 19:14:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # IFS=: 00:07:16.862 19:14:03 -- accel/accel.sh@19 -- # read -r var val 00:07:16.862 19:14:03 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:16.862 19:14:03 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:16.862 19:14:03 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.862 00:07:16.862 real 0m1.405s 00:07:16.862 user 0m1.272s 00:07:16.862 sys 0m0.146s 00:07:16.862 19:14:03 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:16.862 19:14:03 -- common/autotest_common.sh@10 -- # 
set +x 00:07:16.862 ************************************ 00:07:16.862 END TEST accel_deomp_full_mthread 00:07:16.862 ************************************ 00:07:16.862 19:14:03 -- accel/accel.sh@124 -- # [[ n == y ]] 00:07:16.862 19:14:03 -- accel/accel.sh@137 -- # build_accel_config 00:07:16.862 19:14:03 -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:16.862 19:14:03 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:16.862 19:14:03 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:16.862 19:14:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.862 19:14:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.862 19:14:03 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:16.862 19:14:03 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:16.862 19:14:03 -- accel/accel.sh@40 -- # local IFS=, 00:07:16.862 19:14:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:16.862 19:14:03 -- accel/accel.sh@41 -- # jq -r . 00:07:16.862 19:14:03 -- common/autotest_common.sh@10 -- # set +x 00:07:16.862 ************************************ 00:07:16.862 START TEST accel_dif_functional_tests 00:07:16.862 ************************************ 00:07:16.862 19:14:03 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:16.862 [2024-04-24 19:14:03.682899] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:16.862 [2024-04-24 19:14:03.682977] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618858 ] 00:07:16.862 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.862 [2024-04-24 19:14:03.757257] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:16.862 [2024-04-24 19:14:03.841340] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.862 [2024-04-24 19:14:03.841429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:16.862 [2024-04-24 19:14:03.841431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.120 00:07:17.120 00:07:17.120 CUnit - A unit testing framework for C - Version 2.1-3 00:07:17.120 http://cunit.sourceforge.net/ 00:07:17.120 00:07:17.120 00:07:17.120 Suite: accel_dif 00:07:17.120 Test: verify: DIF generated, GUARD check ...passed 00:07:17.120 Test: verify: DIF generated, APPTAG check ...passed 00:07:17.120 Test: verify: DIF generated, REFTAG check ...passed 00:07:17.120 Test: verify: DIF not generated, GUARD check ...[2024-04-24 19:14:03.919727] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:17.120 [2024-04-24 19:14:03.919782] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:17.120 passed 00:07:17.120 Test: verify: DIF not generated, APPTAG check ...[2024-04-24 19:14:03.919831] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:17.120 [2024-04-24 19:14:03.919851] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:17.120 passed 00:07:17.120 Test: verify: DIF not generated, REFTAG check ...[2024-04-24 19:14:03.919872] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:17.120 [2024-04-24 
19:14:03.919892] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:17.120 passed 00:07:17.120 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:17.120 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-24 19:14:03.919938] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:17.120 passed 00:07:17.120 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:17.120 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:17.120 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:17.120 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-24 19:14:03.920041] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:17.120 passed 00:07:17.120 Test: generate copy: DIF generated, GUARD check ...passed 00:07:17.120 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:17.120 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:17.120 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:17.120 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:17.120 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:17.120 Test: generate copy: iovecs-len validate ...[2024-04-24 19:14:03.920238] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:07:17.120 passed 00:07:17.121 Test: generate copy: buffer alignment validate ...passed 00:07:17.121 00:07:17.121 Run Summary: Type Total Ran Passed Failed Inactive 00:07:17.121 suites 1 1 n/a 0 0 00:07:17.121 tests 20 20 20 0 0 00:07:17.121 asserts 204 204 204 0 n/a 00:07:17.121 00:07:17.121 Elapsed time = 0.002 seconds 00:07:17.121 00:07:17.121 real 0m0.445s 00:07:17.121 user 0m0.638s 00:07:17.121 sys 0m0.168s 00:07:17.121 19:14:04 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:17.121 19:14:04 -- common/autotest_common.sh@10 -- # set +x 00:07:17.121 ************************************ 00:07:17.121 END TEST accel_dif_functional_tests 00:07:17.121 ************************************ 00:07:17.379 00:07:17.379 real 0m35.070s 00:07:17.379 user 0m36.134s 00:07:17.379 sys 0m6.602s 00:07:17.379 19:14:04 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:17.379 19:14:04 -- common/autotest_common.sh@10 -- # set +x 00:07:17.379 ************************************ 00:07:17.379 END TEST accel 00:07:17.379 ************************************ 00:07:17.379 19:14:04 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:17.379 19:14:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:17.379 19:14:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.379 19:14:04 -- common/autotest_common.sh@10 -- # set +x 00:07:17.379 ************************************ 00:07:17.379 START TEST accel_rpc 00:07:17.379 ************************************ 00:07:17.379 19:14:04 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:17.637 * Looking for test storage... 
00:07:17.637 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:17.637 19:14:04 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:17.637 19:14:04 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1618992 00:07:17.637 19:14:04 -- accel/accel_rpc.sh@15 -- # waitforlisten 1618992 00:07:17.637 19:14:04 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:17.637 19:14:04 -- common/autotest_common.sh@817 -- # '[' -z 1618992 ']' 00:07:17.637 19:14:04 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.637 19:14:04 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:17.637 19:14:04 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.637 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.637 19:14:04 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:17.637 19:14:04 -- common/autotest_common.sh@10 -- # set +x 00:07:17.637 [2024-04-24 19:14:04.463626] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:17.637 [2024-04-24 19:14:04.463703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618992 ] 00:07:17.637 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.637 [2024-04-24 19:14:04.540012] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.637 [2024-04-24 19:14:04.632676] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.567 19:14:05 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:18.567 19:14:05 -- common/autotest_common.sh@850 -- # return 0 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:18.567 19:14:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:18.567 19:14:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:18.567 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.567 ************************************ 00:07:18.567 START TEST accel_assign_opcode 00:07:18.567 ************************************ 00:07:18.567 19:14:05 -- common/autotest_common.sh@1111 -- # accel_assign_opcode_test_suite 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:18.567 19:14:05 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:18.567 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.567 [2024-04-24 19:14:05.415018] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:18.567 19:14:05 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:18.567 19:14:05 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:18.567 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.567 [2024-04-24 19:14:05.423027] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation copy will be assigned to module software 00:07:18.567 19:14:05 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:18.567 19:14:05 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:18.567 19:14:05 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:18.567 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.824 19:14:05 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:18.824 19:14:05 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:18.824 19:14:05 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:18.824 19:14:05 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:18.824 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.824 19:14:05 -- accel/accel_rpc.sh@42 -- # grep software 00:07:18.824 19:14:05 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:18.824 software 00:07:18.824 00:07:18.824 real 0m0.236s 00:07:18.824 user 0m0.045s 00:07:18.824 sys 0m0.013s 00:07:18.824 19:14:05 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:18.824 19:14:05 -- common/autotest_common.sh@10 -- # set +x 00:07:18.824 ************************************ 00:07:18.824 END TEST accel_assign_opcode 00:07:18.824 ************************************ 00:07:18.824 19:14:05 -- accel/accel_rpc.sh@55 -- # killprocess 1618992 00:07:18.824 19:14:05 -- common/autotest_common.sh@936 -- # '[' -z 1618992 ']' 00:07:18.824 19:14:05 -- common/autotest_common.sh@940 -- # kill -0 1618992 00:07:18.824 19:14:05 -- common/autotest_common.sh@941 -- # uname 00:07:18.824 19:14:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:18.825 19:14:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1618992 00:07:18.825 19:14:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:18.825 19:14:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:18.825 19:14:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1618992' 00:07:18.825 killing process with pid 1618992 00:07:18.825 19:14:05 -- common/autotest_common.sh@955 -- # kill 1618992 00:07:18.825 19:14:05 -- common/autotest_common.sh@960 -- # wait 1618992 00:07:19.082 00:07:19.082 real 0m1.739s 00:07:19.082 user 0m1.779s 00:07:19.082 sys 0m0.548s 00:07:19.082 19:14:06 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:19.082 19:14:06 -- common/autotest_common.sh@10 -- # set +x 00:07:19.082 ************************************ 00:07:19.082 END TEST accel_rpc 00:07:19.082 ************************************ 00:07:19.338 19:14:06 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:19.339 19:14:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:19.339 19:14:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:19.339 19:14:06 -- common/autotest_common.sh@10 -- # set +x 00:07:19.339 ************************************ 00:07:19.339 START TEST app_cmdline 00:07:19.339 ************************************ 00:07:19.339 19:14:06 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:19.596 * Looking for test storage... 
00:07:19.596 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:19.596 19:14:06 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:19.596 19:14:06 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1619366 00:07:19.596 19:14:06 -- app/cmdline.sh@18 -- # waitforlisten 1619366 00:07:19.596 19:14:06 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:19.596 19:14:06 -- common/autotest_common.sh@817 -- # '[' -z 1619366 ']' 00:07:19.596 19:14:06 -- common/autotest_common.sh@821 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.596 19:14:06 -- common/autotest_common.sh@822 -- # local max_retries=100 00:07:19.596 19:14:06 -- common/autotest_common.sh@824 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.596 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.596 19:14:06 -- common/autotest_common.sh@826 -- # xtrace_disable 00:07:19.596 19:14:06 -- common/autotest_common.sh@10 -- # set +x 00:07:19.596 [2024-04-24 19:14:06.428480] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:19.596 [2024-04-24 19:14:06.428550] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619366 ] 00:07:19.596 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.596 [2024-04-24 19:14:06.502984] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.596 [2024-04-24 19:14:06.584367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.646 19:14:07 -- common/autotest_common.sh@846 -- # (( i == 0 )) 00:07:20.646 19:14:07 -- common/autotest_common.sh@850 -- # return 0 00:07:20.646 19:14:07 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:20.646 { 00:07:20.646 "version": "SPDK v24.05-pre git sha1 5c8d451f1", 00:07:20.646 "fields": { 00:07:20.646 "major": 24, 00:07:20.646 "minor": 5, 00:07:20.646 "patch": 0, 00:07:20.646 "suffix": "-pre", 00:07:20.646 "commit": "5c8d451f1" 00:07:20.646 } 00:07:20.646 } 00:07:20.646 19:14:07 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:20.646 19:14:07 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:20.646 19:14:07 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:20.646 19:14:07 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:20.646 19:14:07 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:20.646 19:14:07 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:20.646 19:14:07 -- common/autotest_common.sh@549 -- # xtrace_disable 00:07:20.646 19:14:07 -- common/autotest_common.sh@10 -- # set +x 00:07:20.646 19:14:07 -- app/cmdline.sh@26 -- # sort 00:07:20.646 19:14:07 -- common/autotest_common.sh@577 -- # [[ 0 == 0 ]] 00:07:20.646 19:14:07 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:20.646 19:14:07 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:20.646 19:14:07 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:20.646 19:14:07 -- common/autotest_common.sh@638 -- # local es=0 00:07:20.646 19:14:07 -- 
common/autotest_common.sh@640 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:20.646 19:14:07 -- common/autotest_common.sh@626 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:20.646 19:14:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:20.646 19:14:07 -- common/autotest_common.sh@630 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:20.646 19:14:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:20.646 19:14:07 -- common/autotest_common.sh@632 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:20.646 19:14:07 -- common/autotest_common.sh@630 -- # case "$(type -t "$arg")" in 00:07:20.646 19:14:07 -- common/autotest_common.sh@632 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:20.646 19:14:07 -- common/autotest_common.sh@632 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:20.646 19:14:07 -- common/autotest_common.sh@641 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:20.646 request: 00:07:20.646 { 00:07:20.646 "method": "env_dpdk_get_mem_stats", 00:07:20.646 "req_id": 1 00:07:20.646 } 00:07:20.646 Got JSON-RPC error response 00:07:20.646 response: 00:07:20.646 { 00:07:20.646 "code": -32601, 00:07:20.646 "message": "Method not found" 00:07:20.646 } 00:07:20.904 19:14:07 -- common/autotest_common.sh@641 -- # es=1 00:07:20.904 19:14:07 -- common/autotest_common.sh@649 -- # (( es > 128 )) 00:07:20.904 19:14:07 -- common/autotest_common.sh@660 -- # [[ -n '' ]] 00:07:20.904 19:14:07 -- common/autotest_common.sh@665 -- # (( !es == 0 )) 00:07:20.904 19:14:07 -- app/cmdline.sh@1 -- # killprocess 1619366 00:07:20.904 19:14:07 -- common/autotest_common.sh@936 -- # '[' -z 1619366 ']' 00:07:20.905 19:14:07 -- common/autotest_common.sh@940 -- # kill -0 1619366 00:07:20.905 19:14:07 -- common/autotest_common.sh@941 -- # uname 00:07:20.905 19:14:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:20.905 19:14:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1619366 00:07:20.905 19:14:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:20.905 19:14:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:20.905 19:14:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1619366' 00:07:20.905 killing process with pid 1619366 00:07:20.905 19:14:07 -- common/autotest_common.sh@955 -- # kill 1619366 00:07:20.905 19:14:07 -- common/autotest_common.sh@960 -- # wait 1619366 00:07:21.163 00:07:21.163 real 0m1.735s 00:07:21.163 user 0m2.008s 00:07:21.163 sys 0m0.502s 00:07:21.163 19:14:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:21.163 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.163 ************************************ 00:07:21.163 END TEST app_cmdline 00:07:21.163 ************************************ 00:07:21.163 19:14:08 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:21.163 19:14:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:21.163 19:14:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.163 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.420 ************************************ 00:07:21.420 START TEST version 00:07:21.420 
************************************ 00:07:21.420 19:14:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:21.420 * Looking for test storage... 00:07:21.420 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:21.420 19:14:08 -- app/version.sh@17 -- # get_header_version major 00:07:21.420 19:14:08 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:21.420 19:14:08 -- app/version.sh@14 -- # tr -d '"' 00:07:21.420 19:14:08 -- app/version.sh@14 -- # cut -f2 00:07:21.420 19:14:08 -- app/version.sh@17 -- # major=24 00:07:21.420 19:14:08 -- app/version.sh@18 -- # get_header_version minor 00:07:21.420 19:14:08 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:21.420 19:14:08 -- app/version.sh@14 -- # tr -d '"' 00:07:21.420 19:14:08 -- app/version.sh@14 -- # cut -f2 00:07:21.420 19:14:08 -- app/version.sh@18 -- # minor=5 00:07:21.420 19:14:08 -- app/version.sh@19 -- # get_header_version patch 00:07:21.420 19:14:08 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:21.420 19:14:08 -- app/version.sh@14 -- # cut -f2 00:07:21.420 19:14:08 -- app/version.sh@14 -- # tr -d '"' 00:07:21.420 19:14:08 -- app/version.sh@19 -- # patch=0 00:07:21.420 19:14:08 -- app/version.sh@20 -- # get_header_version suffix 00:07:21.420 19:14:08 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:21.420 19:14:08 -- app/version.sh@14 -- # cut -f2 00:07:21.420 19:14:08 -- app/version.sh@14 -- # tr -d '"' 00:07:21.420 19:14:08 -- app/version.sh@20 -- # suffix=-pre 00:07:21.420 19:14:08 -- app/version.sh@22 -- # version=24.5 00:07:21.420 19:14:08 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:21.420 19:14:08 -- app/version.sh@28 -- # version=24.5rc0 00:07:21.420 19:14:08 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:21.420 19:14:08 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:21.420 19:14:08 -- app/version.sh@30 -- # py_version=24.5rc0 00:07:21.420 19:14:08 -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:07:21.420 00:07:21.420 real 0m0.184s 00:07:21.420 user 0m0.085s 00:07:21.420 sys 0m0.145s 00:07:21.420 19:14:08 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:07:21.420 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.420 ************************************ 00:07:21.420 END TEST version 00:07:21.420 ************************************ 00:07:21.678 19:14:08 -- spdk/autotest.sh@184 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@194 -- # uname -s 00:07:21.678 19:14:08 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 
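The version test above pulls each component out of include/spdk/version.h with the grep/cut/tr pipeline logged by get_header_version. A standalone equivalent (assuming the stock header layout, where cut's default tab delimiter isolates the value) is:

  cd spdk
  grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' include/spdk/version.h \
    | cut -f2 | tr -d '"'    # -> 24, per the log above
  grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' include/spdk/version.h \
    | cut -f2 | tr -d '"'    # -> 5

Because patch == 0 and the suffix is -pre, the script settles on 24.5rc0 and checks it against python3 -c 'import spdk; print(spdk.__version__)', which reported the same 24.5rc0.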
00:07:21.678 19:14:08 -- spdk/autotest.sh@254 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@258 -- # timing_exit lib 00:07:21.678 19:14:08 -- common/autotest_common.sh@716 -- # xtrace_disable 00:07:21.678 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.678 19:14:08 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@268 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@277 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@310 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@314 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@328 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@341 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@345 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@354 -- # '[' 0 -eq 1 ']' 00:07:21.678 19:14:08 -- spdk/autotest.sh@361 -- # [[ 0 -eq 1 ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@369 -- # [[ 1 -eq 1 ]] 00:07:21.678 19:14:08 -- spdk/autotest.sh@370 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:21.678 19:14:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:21.678 19:14:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.678 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.678 ************************************ 00:07:21.678 START TEST llvm_fuzz 00:07:21.678 ************************************ 00:07:21.678 19:14:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:21.936 * Looking for test storage... 
00:07:21.936 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:21.936 19:14:08 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:21.936 19:14:08 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:21.936 19:14:08 -- common/autotest_common.sh@536 -- # fuzzers=() 00:07:21.936 19:14:08 -- common/autotest_common.sh@536 -- # local fuzzers 00:07:21.936 19:14:08 -- common/autotest_common.sh@538 -- # [[ -n '' ]] 00:07:21.936 19:14:08 -- common/autotest_common.sh@541 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:21.936 19:14:08 -- common/autotest_common.sh@542 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:21.936 19:14:08 -- common/autotest_common.sh@545 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:21.936 19:14:08 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:21.936 19:14:08 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:21.936 19:14:08 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:21.936 19:14:08 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:21.936 19:14:08 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:21.936 19:14:08 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:21.936 19:14:08 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:21.936 19:14:08 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:21.936 19:14:08 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:21.936 19:14:08 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:21.936 19:14:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:21.936 19:14:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.936 19:14:08 -- common/autotest_common.sh@10 -- # set +x 00:07:21.936 ************************************ 00:07:21.936 START TEST nvmf_fuzz 00:07:21.936 ************************************ 00:07:21.936 19:14:08 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:21.936 * Looking for test storage... 
00:07:21.936 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:21.936 19:14:08 -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:21.936 19:14:08 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:21.936 19:14:08 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:21.936 19:14:08 -- common/autotest_common.sh@34 -- # set -e 00:07:21.936 19:14:08 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:21.936 19:14:08 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:21.936 19:14:08 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:21.936 19:14:08 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:21.936 19:14:08 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:21.936 19:14:08 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:21.936 19:14:08 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:21.936 19:14:08 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:21.936 19:14:08 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:21.936 19:14:08 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:21.936 19:14:08 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:21.936 19:14:08 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:21.936 19:14:08 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:21.936 19:14:08 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:21.936 19:14:08 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:21.936 19:14:08 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:21.936 19:14:08 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:21.936 19:14:08 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:21.936 19:14:08 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:21.936 19:14:08 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:21.936 19:14:08 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:21.936 19:14:08 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:21.936 19:14:08 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:21.936 19:14:08 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:21.936 19:14:08 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:21.936 19:14:08 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:21.936 19:14:08 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:21.936 19:14:08 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:21.936 19:14:08 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:21.936 19:14:08 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:21.936 19:14:08 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:21.936 19:14:08 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:21.936 19:14:08 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:21.936 19:14:08 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:21.936 19:14:08 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:07:21.936 19:14:08 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:21.936 19:14:08 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:21.936 19:14:08 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:21.936 19:14:08 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:21.936 19:14:08 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:21.936 19:14:08 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:21.936 19:14:08 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:21.936 19:14:08 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:21.936 19:14:08 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:21.936 19:14:08 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:21.936 19:14:08 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:21.936 19:14:08 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:21.936 19:14:08 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:21.936 19:14:08 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:21.936 19:14:08 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:21.936 19:14:08 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:21.936 19:14:08 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:21.936 19:14:08 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:07:21.936 19:14:08 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:07:21.936 19:14:08 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:07:21.936 19:14:08 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:07:21.936 19:14:08 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:07:21.936 19:14:08 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:07:21.936 19:14:08 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:07:22.196 19:14:08 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:07:22.196 19:14:08 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:07:22.196 19:14:08 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:07:22.196 19:14:08 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:07:22.196 19:14:08 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:07:22.196 19:14:08 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:07:22.196 19:14:08 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:07:22.196 19:14:08 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:07:22.196 19:14:08 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:22.196 19:14:08 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:07:22.196 19:14:08 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:07:22.196 19:14:08 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:07:22.196 19:14:08 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:07:22.196 19:14:08 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:07:22.196 19:14:08 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:07:22.196 19:14:08 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:07:22.196 19:14:08 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:07:22.196 19:14:08 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:07:22.196 19:14:08 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 
00:07:22.196 19:14:08 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:07:22.196 19:14:08 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:22.196 19:14:08 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:07:22.196 19:14:08 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:07:22.196 19:14:08 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:22.196 19:14:08 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:22.196 19:14:08 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:22.196 19:14:08 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:22.196 19:14:08 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:22.196 19:14:08 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:22.196 19:14:08 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:22.196 19:14:08 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:22.196 19:14:08 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:22.196 19:14:08 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:22.196 19:14:08 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:22.196 19:14:08 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:22.196 19:14:08 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:22.196 19:14:08 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:22.196 19:14:08 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:22.196 19:14:08 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:22.196 #define SPDK_CONFIG_H 00:07:22.196 #define SPDK_CONFIG_APPS 1 00:07:22.196 #define SPDK_CONFIG_ARCH native 00:07:22.196 #undef SPDK_CONFIG_ASAN 00:07:22.196 #undef SPDK_CONFIG_AVAHI 00:07:22.196 #undef SPDK_CONFIG_CET 00:07:22.196 #define SPDK_CONFIG_COVERAGE 1 00:07:22.196 #define SPDK_CONFIG_CROSS_PREFIX 00:07:22.197 #undef SPDK_CONFIG_CRYPTO 00:07:22.197 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:22.197 #undef SPDK_CONFIG_CUSTOMOCF 00:07:22.197 #undef SPDK_CONFIG_DAOS 00:07:22.197 #define SPDK_CONFIG_DAOS_DIR 00:07:22.197 #define SPDK_CONFIG_DEBUG 1 00:07:22.197 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:22.197 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:22.197 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:22.197 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:22.197 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:22.197 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:22.197 #define SPDK_CONFIG_EXAMPLES 1 00:07:22.197 #undef SPDK_CONFIG_FC 00:07:22.197 #define SPDK_CONFIG_FC_PATH 00:07:22.197 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:22.197 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:22.197 #undef SPDK_CONFIG_FUSE 00:07:22.197 #define SPDK_CONFIG_FUZZER 1 00:07:22.197 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:22.197 #undef SPDK_CONFIG_GOLANG 00:07:22.197 
#define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:22.197 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:22.197 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:22.197 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:07:22.197 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:22.197 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:22.197 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:22.197 #define SPDK_CONFIG_IDXD 1 00:07:22.197 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:22.197 #undef SPDK_CONFIG_IPSEC_MB 00:07:22.197 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:22.197 #define SPDK_CONFIG_ISAL 1 00:07:22.197 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:22.197 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:22.197 #define SPDK_CONFIG_LIBDIR 00:07:22.197 #undef SPDK_CONFIG_LTO 00:07:22.197 #define SPDK_CONFIG_MAX_LCORES 00:07:22.197 #define SPDK_CONFIG_NVME_CUSE 1 00:07:22.197 #undef SPDK_CONFIG_OCF 00:07:22.197 #define SPDK_CONFIG_OCF_PATH 00:07:22.197 #define SPDK_CONFIG_OPENSSL_PATH 00:07:22.197 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:22.197 #define SPDK_CONFIG_PGO_DIR 00:07:22.197 #undef SPDK_CONFIG_PGO_USE 00:07:22.197 #define SPDK_CONFIG_PREFIX /usr/local 00:07:22.197 #undef SPDK_CONFIG_RAID5F 00:07:22.197 #undef SPDK_CONFIG_RBD 00:07:22.197 #define SPDK_CONFIG_RDMA 1 00:07:22.197 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:22.197 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:22.197 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:22.197 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:22.197 #undef SPDK_CONFIG_SHARED 00:07:22.197 #undef SPDK_CONFIG_SMA 00:07:22.197 #define SPDK_CONFIG_TESTS 1 00:07:22.197 #undef SPDK_CONFIG_TSAN 00:07:22.197 #define SPDK_CONFIG_UBLK 1 00:07:22.197 #define SPDK_CONFIG_UBSAN 1 00:07:22.197 #undef SPDK_CONFIG_UNIT_TESTS 00:07:22.197 #undef SPDK_CONFIG_URING 00:07:22.197 #define SPDK_CONFIG_URING_PATH 00:07:22.197 #undef SPDK_CONFIG_URING_ZNS 00:07:22.197 #undef SPDK_CONFIG_USDT 00:07:22.197 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:22.197 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:22.197 #define SPDK_CONFIG_VFIO_USER 1 00:07:22.197 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:22.197 #define SPDK_CONFIG_VHOST 1 00:07:22.197 #define SPDK_CONFIG_VIRTIO 1 00:07:22.197 #undef SPDK_CONFIG_VTUNE 00:07:22.197 #define SPDK_CONFIG_VTUNE_DIR 00:07:22.197 #define SPDK_CONFIG_WERROR 1 00:07:22.197 #define SPDK_CONFIG_WPDK_DIR 00:07:22.197 #undef SPDK_CONFIG_XNVME 00:07:22.197 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:22.197 19:14:08 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:22.197 19:14:08 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:22.197 19:14:08 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:22.197 19:14:08 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:22.197 19:14:08 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:22.197 19:14:08 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.197 19:14:08 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.197 19:14:08 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.197 19:14:08 -- paths/export.sh@5 -- # export PATH 00:07:22.197 19:14:08 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.197 19:14:08 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:22.197 19:14:08 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:22.197 19:14:08 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:22.197 19:14:08 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:22.197 19:14:08 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:22.197 19:14:08 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:22.197 19:14:08 -- pm/common@67 -- # TEST_TAG=N/A 00:07:22.197 19:14:08 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:22.197 19:14:08 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:22.197 19:14:08 -- pm/common@71 -- # uname -s 00:07:22.197 19:14:09 -- pm/common@71 -- # PM_OS=Linux 00:07:22.197 19:14:09 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:22.197 19:14:09 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:07:22.197 19:14:09 -- pm/common@76 -- # [[ Linux == Linux ]] 00:07:22.197 19:14:09 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:07:22.197 19:14:09 -- pm/common@76 -- # [[ ! 
-e /.dockerenv ]] 00:07:22.197 19:14:09 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:22.197 19:14:09 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:22.197 19:14:09 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:07:22.197 19:14:09 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:07:22.197 19:14:09 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:22.197 19:14:09 -- common/autotest_common.sh@57 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:07:22.197 19:14:09 -- common/autotest_common.sh@61 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:22.197 19:14:09 -- common/autotest_common.sh@63 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:07:22.197 19:14:09 -- common/autotest_common.sh@65 -- # : 1 00:07:22.197 19:14:09 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:22.197 19:14:09 -- common/autotest_common.sh@67 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:07:22.197 19:14:09 -- common/autotest_common.sh@69 -- # : 00:07:22.197 19:14:09 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:07:22.197 19:14:09 -- common/autotest_common.sh@71 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:07:22.197 19:14:09 -- common/autotest_common.sh@73 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:07:22.197 19:14:09 -- common/autotest_common.sh@75 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:07:22.197 19:14:09 -- common/autotest_common.sh@77 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:22.197 19:14:09 -- common/autotest_common.sh@79 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:07:22.197 19:14:09 -- common/autotest_common.sh@81 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:07:22.197 19:14:09 -- common/autotest_common.sh@83 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:07:22.197 19:14:09 -- common/autotest_common.sh@85 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:07:22.197 19:14:09 -- common/autotest_common.sh@87 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:07:22.197 19:14:09 -- common/autotest_common.sh@89 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:07:22.197 19:14:09 -- common/autotest_common.sh@91 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:07:22.197 19:14:09 -- common/autotest_common.sh@93 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:07:22.197 19:14:09 -- common/autotest_common.sh@95 -- # : 0 00:07:22.197 19:14:09 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:22.197 19:14:09 -- common/autotest_common.sh@97 -- # : 1 00:07:22.197 19:14:09 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:07:22.197 19:14:09 -- common/autotest_common.sh@99 -- # : 1 00:07:22.197 19:14:09 -- common/autotest_common.sh@100 -- # export 
SPDK_TEST_FUZZER_SHORT 00:07:22.198 19:14:09 -- common/autotest_common.sh@101 -- # : rdma 00:07:22.198 19:14:09 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:22.198 19:14:09 -- common/autotest_common.sh@103 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:07:22.198 19:14:09 -- common/autotest_common.sh@105 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:07:22.198 19:14:09 -- common/autotest_common.sh@107 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:07:22.198 19:14:09 -- common/autotest_common.sh@109 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:07:22.198 19:14:09 -- common/autotest_common.sh@111 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:07:22.198 19:14:09 -- common/autotest_common.sh@113 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:07:22.198 19:14:09 -- common/autotest_common.sh@115 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:07:22.198 19:14:09 -- common/autotest_common.sh@117 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:22.198 19:14:09 -- common/autotest_common.sh@119 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:07:22.198 19:14:09 -- common/autotest_common.sh@121 -- # : 1 00:07:22.198 19:14:09 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:07:22.198 19:14:09 -- common/autotest_common.sh@123 -- # : 00:07:22.198 19:14:09 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:22.198 19:14:09 -- common/autotest_common.sh@125 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:07:22.198 19:14:09 -- common/autotest_common.sh@127 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:07:22.198 19:14:09 -- common/autotest_common.sh@129 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:07:22.198 19:14:09 -- common/autotest_common.sh@131 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:07:22.198 19:14:09 -- common/autotest_common.sh@133 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:07:22.198 19:14:09 -- common/autotest_common.sh@135 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:07:22.198 19:14:09 -- common/autotest_common.sh@137 -- # : 00:07:22.198 19:14:09 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:07:22.198 19:14:09 -- common/autotest_common.sh@139 -- # : true 00:07:22.198 19:14:09 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:07:22.198 19:14:09 -- common/autotest_common.sh@141 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:07:22.198 19:14:09 -- common/autotest_common.sh@143 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:07:22.198 19:14:09 -- common/autotest_common.sh@145 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:07:22.198 19:14:09 -- common/autotest_common.sh@147 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@148 
-- # export SPDK_TEST_USE_IGB_UIO 00:07:22.198 19:14:09 -- common/autotest_common.sh@149 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:07:22.198 19:14:09 -- common/autotest_common.sh@151 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:07:22.198 19:14:09 -- common/autotest_common.sh@153 -- # : 00:07:22.198 19:14:09 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:07:22.198 19:14:09 -- common/autotest_common.sh@155 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:07:22.198 19:14:09 -- common/autotest_common.sh@157 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:07:22.198 19:14:09 -- common/autotest_common.sh@159 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:07:22.198 19:14:09 -- common/autotest_common.sh@161 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:07:22.198 19:14:09 -- common/autotest_common.sh@163 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:07:22.198 19:14:09 -- common/autotest_common.sh@166 -- # : 00:07:22.198 19:14:09 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:07:22.198 19:14:09 -- common/autotest_common.sh@168 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:07:22.198 19:14:09 -- common/autotest_common.sh@170 -- # : 0 00:07:22.198 19:14:09 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:22.198 19:14:09 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:22.198 19:14:09 -- 
common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:22.198 19:14:09 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:22.198 19:14:09 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:22.198 19:14:09 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:22.198 19:14:09 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:22.198 19:14:09 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:22.198 19:14:09 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:07:22.198 19:14:09 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:22.198 19:14:09 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:22.198 19:14:09 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:22.198 19:14:09 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:22.198 19:14:09 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:22.198 19:14:09 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:07:22.198 19:14:09 -- common/autotest_common.sh@199 -- # cat 00:07:22.198 19:14:09 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:07:22.198 19:14:09 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:22.198 19:14:09 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 
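
The exports traced above wire up the sanitizer runtimes before the fuzz target starts: ASAN_OPTIONS and UBSAN_OPTIONS make the first report fatal, and a LeakSanitizer suppression file is rebuilt at /var/tmp/asan_suppression_file (the rm -rf, cat, and echo leak:libfuse3.so steps). A minimal bash sketch of the same pattern follows; only the file path, the libfuse3.so rule, and the option strings are taken from this trace, the rest is illustrative.

  # Sketch: rebuild an LSAN suppression list and export sanitizer options.
  suppress_file=/var/tmp/asan_suppression_file
  rm -rf "$suppress_file"
  echo "leak:libfuse3.so" >> "$suppress_file"   # one leak:<pattern> rule per line

  # LSAN picks up the file via LSAN_OPTIONS; halt_on_error/abort_on_error
  # make UBSAN failures fatal so the CI job stops at the first report.
  export LSAN_OPTIONS=suppressions=$suppress_file
  export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
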
00:07:22.198 19:14:09 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:22.198 19:14:09 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:22.198 19:14:09 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']'
00:07:22.198 19:14:09 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR
00:07:22.198 19:14:09 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:22.198 19:14:09 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:22.198 19:14:09 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:22.198 19:14:09 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:22.198 19:14:09 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:22.198 19:14:09 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:22.198 19:14:09 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:22.198 19:14:09 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:22.198 19:14:09 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:22.198 19:14:09 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:22.198 19:14:09 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:22.198 19:14:09 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:22.199 19:14:09 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']'
00:07:22.199 19:14:09 -- common/autotest_common.sh@252 -- # export valgrind=
00:07:22.199 19:14:09 -- common/autotest_common.sh@252 -- # valgrind=
00:07:22.199 19:14:09 -- common/autotest_common.sh@258 -- # uname -s
00:07:22.199 19:14:09 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']'
00:07:22.199 19:14:09 -- common/autotest_common.sh@259 -- # HUGEMEM=4096
00:07:22.199 19:14:09 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes
00:07:22.199 19:14:09 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes
00:07:22.199 19:14:09 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]]
00:07:22.199 19:14:09 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]]
00:07:22.199 19:14:09 -- common/autotest_common.sh@268 -- # MAKE=make
00:07:22.199 19:14:09 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72
00:07:22.199 19:14:09 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096
00:07:22.199 19:14:09 -- common/autotest_common.sh@285 -- # HUGEMEM=4096
00:07:22.199 19:14:09 -- common/autotest_common.sh@287 -- # NO_HUGE=()
00:07:22.199 19:14:09 -- common/autotest_common.sh@288 -- # TEST_MODE=
00:07:22.199 19:14:09 -- common/autotest_common.sh@307 -- # [[ -z 1619898 ]]
00:07:22.199 19:14:09 -- common/autotest_common.sh@307 -- # kill -0 1619898
00:07:22.199 19:14:09 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648
00:07:22.199 19:14:09 -- common/autotest_common.sh@317 -- # [[ -v testdir ]]
00:07:22.199 19:14:09 -- common/autotest_common.sh@319 -- # local requested_size=2147483648
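
Just before set_test_storage sizes the scratch area, autotest_common.sh@307 checks that the target process (pid 1619898 in this run) is still alive with kill -0, which delivers no signal and only reports whether the pid can be signaled. A standalone sketch of the idiom, with a hypothetical $app_pid argument in place of the literal pid:

  # Sketch of the kill -0 liveness probe seen at autotest_common.sh@307.
  # app_pid is a stand-in; the trace checks the literal pid 1619898.
  app_pid=$1
  if [[ -n "$app_pid" ]] && kill -0 "$app_pid" 2>/dev/null; then
      echo "pid $app_pid is alive; continuing"
  else
      echo "pid $app_pid has exited; aborting" >&2
      exit 1
  fi
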
00:07:22.199 19:14:09 -- common/autotest_common.sh@320 -- # local mount target_dir 00:07:22.199 19:14:09 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:07:22.199 19:14:09 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:07:22.199 19:14:09 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:07:22.199 19:14:09 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:07:22.199 19:14:09 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.uapHCc 00:07:22.199 19:14:09 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:22.199 19:14:09 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.uapHCc/tests/nvmf /tmp/spdk.uapHCc 00:07:22.199 19:14:09 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@316 -- # df -T 00:07:22.199 19:14:09 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=818380800 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=4466049024 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_root 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=86764023808 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=94508572672 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=7744548864 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=47249575936 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254286336 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=4710400 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=18895835136 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=18901716992 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=5881856 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=47253733376 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254286336 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=552960 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # avails["$mount"]=9450852352 00:07:22.199 19:14:09 -- common/autotest_common.sh@351 -- # sizes["$mount"]=9450856448 00:07:22.199 19:14:09 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:07:22.199 19:14:09 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:07:22.199 19:14:09 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:07:22.199 * Looking for test storage... 00:07:22.199 19:14:09 -- common/autotest_common.sh@357 -- # local target_space new_size 00:07:22.199 19:14:09 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}" 00:07:22.199 19:14:09 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:22.199 19:14:09 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:22.199 19:14:09 -- common/autotest_common.sh@361 -- # mount=/ 00:07:22.199 19:14:09 -- common/autotest_common.sh@363 -- # target_space=86764023808 00:07:22.199 19:14:09 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size )) 00:07:22.199 19:14:09 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size )) 00:07:22.199 19:14:09 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@369 -- # [[ / == / ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@370 -- # new_size=9959141376 00:07:22.199 19:14:09 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:22.199 19:14:09 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:22.199 19:14:09 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:22.199 19:14:09 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:22.199 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:22.199 19:14:09 -- common/autotest_common.sh@378 -- # return 0 00:07:22.199 19:14:09 -- 
common/autotest_common.sh@1668 -- # set -o errtrace 00:07:22.199 19:14:09 -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:22.199 19:14:09 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:22.199 19:14:09 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:22.199 19:14:09 -- common/autotest_common.sh@1673 -- # true 00:07:22.199 19:14:09 -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:22.199 19:14:09 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:22.199 19:14:09 -- common/autotest_common.sh@27 -- # exec 00:07:22.199 19:14:09 -- common/autotest_common.sh@29 -- # exec 00:07:22.199 19:14:09 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:22.199 19:14:09 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:22.199 19:14:09 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:22.199 19:14:09 -- common/autotest_common.sh@18 -- # set -x 00:07:22.199 19:14:09 -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:22.199 19:14:09 -- ../common.sh@8 -- # pids=() 00:07:22.199 19:14:09 -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:22.199 19:14:09 -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:22.199 19:14:09 -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:22.199 19:14:09 -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:22.199 19:14:09 -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:22.199 19:14:09 -- nvmf/run.sh@69 -- # mem_size=512 00:07:22.199 19:14:09 -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:22.199 19:14:09 -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:22.199 19:14:09 -- ../common.sh@69 -- # local fuzz_num=25 00:07:22.199 19:14:09 -- ../common.sh@70 -- # local time=1 00:07:22.199 19:14:09 -- ../common.sh@72 -- # (( i = 0 )) 00:07:22.199 19:14:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:22.199 19:14:09 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:22.199 19:14:09 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:22.199 19:14:09 -- nvmf/run.sh@24 -- # local timen=1 00:07:22.199 19:14:09 -- nvmf/run.sh@25 -- # local core=0x1 00:07:22.199 19:14:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:22.199 19:14:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:22.199 19:14:09 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:22.199 19:14:09 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:22.199 19:14:09 -- nvmf/run.sh@34 -- # printf %02d 0 00:07:22.199 19:14:09 -- nvmf/run.sh@34 -- # port=4400 00:07:22.199 19:14:09 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:22.200 19:14:09 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:22.200 19:14:09 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:22.200 19:14:09 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:22.200 19:14:09 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:22.200 19:14:09 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0
00:07:22.200 [2024-04-24 19:14:09.154700] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
00:07:22.200 [2024-04-24 19:14:09.154777] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619943 ]
00:07:22.200 EAL: No free 2048 kB hugepages reported on node 1
00:07:22.457 [2024-04-24 19:14:09.337040] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:22.457 [2024-04-24 19:14:09.408691] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:22.457 [2024-04-24 19:14:09.468115] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:22.714 [2024-04-24 19:14:09.484332] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 ***
00:07:22.714 INFO: Running with entropic power schedule (0xFF, 100).
00:07:22.714 INFO: Seed: 3172443008
00:07:22.714 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2),
00:07:22.714 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48),
00:07:22.714 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:22.715 INFO: A corpus is not provided, starting from an empty corpus
00:07:22.715 #2 INITED exec/s: 0 rss: 64Mb
00:07:22.715 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
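
The invocation above is the product of the nvmf/run.sh steps traced just before it: fuzzer index 0 is zero-padded with printf %02d, appended to the 44xx port base, and substituted into a copy of the shared fuzz_json.conf so each fuzzer gets its own NVMe/TCP listener. A rough bash reconstruction of that step; the variable names are assumptions, only the commands themselves appear in the trace:

  # Reconstruction of the per-run config step from the nvmf/run.sh trace.
  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  fuzzer_type=0

  printf -v suffix '%02d' "$fuzzer_type"   # 0 -> "00"
  port=44${suffix}                         # run 0 listens on 4400, run 1 on 4401
  nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf

  # Rewrite the template's default trsvcid (4420) to this run's port.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

With an empty corpus directory ("INFO: 0 files found ..."), libFuzzer then starts from scratch, which is why the early inputs below are generated rather than mutated from seeds.
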
00:07:22.715 This may also happen if the target rejected all inputs we tried so far 00:07:22.715 [2024-04-24 19:14:09.529007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:22.715 [2024-04-24 19:14:09.529044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.972 NEW_FUNC[1/669]: 0x481d00 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:22.972 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.972 #4 NEW cov: 11630 ft: 11632 corp: 2/71b lim: 320 exec/s: 0 rss: 71Mb L: 70/70 MS: 2 InsertRepeatedBytes-CopyPart- 00:07:22.972 [2024-04-24 19:14:09.902103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:22.972 [2024-04-24 19:14:09.902157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:22.972 NEW_FUNC[1/2]: 0xf21f40 in rte_get_tsc_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/include/rte_cycles.h:61 00:07:22.972 NEW_FUNC[2/2]: 0xfad2c0 in spdk_sock_prep_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/sock.h:297 00:07:22.972 #5 NEW cov: 11763 ft: 12278 corp: 3/193b lim: 320 exec/s: 0 rss: 71Mb L: 122/122 MS: 1 CopyPart- 00:07:22.972 [2024-04-24 19:14:09.962244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:22.972 [2024-04-24 19:14:09.962274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.229 #6 NEW cov: 11769 ft: 12444 corp: 4/315b lim: 320 exec/s: 0 rss: 71Mb L: 122/122 MS: 1 ChangeByte- 00:07:23.230 [2024-04-24 19:14:10.022622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.230 [2024-04-24 19:14:10.022653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.230 #7 NEW cov: 11854 ft: 12755 corp: 5/437b lim: 320 exec/s: 0 rss: 71Mb L: 122/122 MS: 1 ChangeBinInt- 00:07:23.230 [2024-04-24 19:14:10.072903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.230 [2024-04-24 19:14:10.072935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.230 #8 NEW cov: 11854 ft: 12812 corp: 6/559b lim: 320 exec/s: 0 rss: 72Mb L: 122/122 MS: 1 ShuffleBytes- 00:07:23.230 [2024-04-24 19:14:10.133031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:fffffff7 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.230 [2024-04-24 19:14:10.133065] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.230 #9 NEW cov: 11854 ft: 12981 corp: 7/681b lim: 320 exec/s: 0 rss: 72Mb L: 122/122 MS: 1 ChangeBit- 00:07:23.230 [2024-04-24 19:14:10.193707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:fffffff7 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.230 [2024-04-24 19:14:10.193735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.230 [2024-04-24 19:14:10.193841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.230 [2024-04-24 19:14:10.193864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.230 [2024-04-24 19:14:10.193982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff3b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.230 [2024-04-24 19:14:10.194004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.230 #10 NEW cov: 11874 ft: 13358 corp: 8/877b lim: 320 exec/s: 0 rss: 72Mb L: 196/196 MS: 1 InsertRepeatedBytes- 00:07:23.488 [2024-04-24 19:14:10.253548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.488 [2024-04-24 19:14:10.253577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.488 #11 NEW cov: 11874 ft: 13465 corp: 9/947b lim: 320 exec/s: 0 rss: 72Mb L: 70/196 MS: 1 CopyPart- 00:07:23.488 #17 NEW cov: 11878 ft: 13549 corp: 10/1037b lim: 320 exec/s: 0 rss: 72Mb L: 90/196 MS: 1 CrossOver- 00:07:23.488 [2024-04-24 19:14:10.354182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.488 [2024-04-24 19:14:10.354211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.488 [2024-04-24 19:14:10.354331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff0aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaffffffffffffff 00:07:23.488 [2024-04-24 19:14:10.354348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.488 #18 NEW cov: 11878 ft: 13722 corp: 11/1213b lim: 320 exec/s: 0 rss: 72Mb L: 176/196 MS: 1 CopyPart- 00:07:23.488 [2024-04-24 19:14:10.404151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.488 [2024-04-24 19:14:10.404179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.488 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:23.488 #24 NEW cov: 11895 ft: 13772 corp: 
12/1283b lim: 320 exec/s: 0 rss: 72Mb L: 70/196 MS: 1 ShuffleBytes- 00:07:23.488 [2024-04-24 19:14:10.454820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.488 [2024-04-24 19:14:10.454849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.488 [2024-04-24 19:14:10.454960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:23.488 [2024-04-24 19:14:10.454976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.488 [2024-04-24 19:14:10.455074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.488 [2024-04-24 19:14:10.455091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.488 #25 NEW cov: 11895 ft: 13820 corp: 13/1508b lim: 320 exec/s: 0 rss: 72Mb L: 225/225 MS: 1 InsertRepeatedBytes- 00:07:23.746 [2024-04-24 19:14:10.504981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.505009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.505110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffffffff7 00:07:23.746 [2024-04-24 19:14:10.505127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.505229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.505245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.746 #26 NEW cov: 11895 ft: 13887 corp: 14/1700b lim: 320 exec/s: 26 rss: 72Mb L: 192/225 MS: 1 CrossOver- 00:07:23.746 [2024-04-24 19:14:10.555331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:fffffff7 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.555359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.555460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:23.746 [2024-04-24 19:14:10.555477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.555575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff3b SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.555590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.746 #27 NEW cov: 11895 ft: 13899 corp: 15/1952b lim: 320 exec/s: 27 rss: 72Mb L: 252/252 MS: 1 CopyPart- 00:07:23.746 [2024-04-24 19:14:10.615427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.615455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.615561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xff08fffffffffff7 00:07:23.746 [2024-04-24 19:14:10.615577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.615673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.615688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.746 #28 NEW cov: 11895 ft: 13903 corp: 16/2144b lim: 320 exec/s: 28 rss: 72Mb L: 192/252 MS: 1 ChangeBinInt- 00:07:23.746 [2024-04-24 19:14:10.675932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.675959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.676082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff0aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.676111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.676205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.676221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.676315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.676333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:23.746 #29 NEW cov: 11895 ft: 14113 corp: 17/2425b lim: 320 exec/s: 29 rss: 72Mb L: 281/281 MS: 1 CopyPart- 00:07:23.746 [2024-04-24 19:14:10.735774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 
19:14:10.735802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:23.746 [2024-04-24 19:14:10.735906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff0aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:23.746 [2024-04-24 19:14:10.735922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:23.746 #30 NEW cov: 11895 ft: 14135 corp: 18/2579b lim: 320 exec/s: 30 rss: 72Mb L: 154/281 MS: 1 EraseBytes- 00:07:24.003 [2024-04-24 19:14:10.785736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.003 [2024-04-24 19:14:10.785764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.003 #31 NEW cov: 11895 ft: 14182 corp: 19/2701b lim: 320 exec/s: 31 rss: 72Mb L: 122/281 MS: 1 ChangeBinInt- 00:07:24.003 [2024-04-24 19:14:10.835955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffff3b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.003 [2024-04-24 19:14:10.835983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.003 #32 NEW cov: 11895 ft: 14192 corp: 20/2769b lim: 320 exec/s: 32 rss: 72Mb L: 68/281 MS: 1 EraseBytes- 00:07:24.003 [2024-04-24 19:14:10.886201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ff0affff cdw11:ffffff08 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.003 [2024-04-24 19:14:10.886231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.004 #33 NEW cov: 11895 ft: 14211 corp: 21/2891b lim: 320 exec/s: 33 rss: 72Mb L: 122/281 MS: 1 ShuffleBytes- 00:07:24.004 [2024-04-24 19:14:10.946825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.004 [2024-04-24 19:14:10.946852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.004 [2024-04-24 19:14:10.946965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.004 [2024-04-24 19:14:10.946982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.004 [2024-04-24 19:14:10.947069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:24.004 [2024-04-24 19:14:10.947084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.004 #34 NEW cov: 11895 ft: 14215 corp: 22/3116b lim: 320 exec/s: 34 rss: 73Mb L: 225/281 MS: 1 ChangeByte- 00:07:24.004 [2024-04-24 19:14:11.006956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 
nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.004 [2024-04-24 19:14:11.006986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.004 [2024-04-24 19:14:11.007106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xff08fffffffffff7 00:07:24.004 [2024-04-24 19:14:11.007122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.004 [2024-04-24 19:14:11.007228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.004 [2024-04-24 19:14:11.007244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.261 #35 NEW cov: 11895 ft: 14237 corp: 23/3308b lim: 320 exec/s: 35 rss: 73Mb L: 192/281 MS: 1 ShuffleBytes- 00:07:24.261 [2024-04-24 19:14:11.067285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.067315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.067422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff0aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.067438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.067541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.067558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.067656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:7 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.067673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.261 #36 NEW cov: 11895 ft: 14244 corp: 24/3589b lim: 320 exec/s: 36 rss: 73Mb L: 281/281 MS: 1 CMP- DE: "\000\000"- 00:07:24.261 [2024-04-24 19:14:11.127346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.127376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.127488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffffffff7 00:07:24.261 [2024-04-24 19:14:11.127507] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.127610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.127628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.261 #37 NEW cov: 11895 ft: 14258 corp: 25/3781b lim: 320 exec/s: 37 rss: 73Mb L: 192/281 MS: 1 ChangeBit- 00:07:24.261 [2024-04-24 19:14:11.177618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.177647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.177759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffffffff7 00:07:24.261 [2024-04-24 19:14:11.177776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.177872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.261 [2024-04-24 19:14:11.177888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.261 #38 NEW cov: 11895 ft: 14285 corp: 26/3973b lim: 320 exec/s: 38 rss: 73Mb L: 192/281 MS: 1 ShuffleBytes- 00:07:24.261 [2024-04-24 19:14:11.228241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.261 [2024-04-24 19:14:11.228271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.228373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:ffffff00 cdw10:ffffffff cdw11:ffffffff 00:07:24.261 [2024-04-24 19:14:11.228390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.228507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:fffff7ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.261 [2024-04-24 19:14:11.228526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.261 [2024-04-24 19:14:11.228621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:24.261 [2024-04-24 19:14:11.228638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:24.261 #39 NEW cov: 11895 ft: 14321 corp: 27/4258b lim: 320 exec/s: 39 rss: 73Mb L: 285/285 MS: 1 CopyPart- 00:07:24.519 [2024-04-24 19:14:11.278049] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.278085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.278188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffff0aff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xaffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.278208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.519 #40 NEW cov: 11895 ft: 14332 corp: 28/4434b lim: 320 exec/s: 40 rss: 73Mb L: 176/285 MS: 1 ChangeBit- 00:07:24.519 [2024-04-24 19:14:11.328325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.328352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.328460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffffffff7 00:07:24.519 [2024-04-24 19:14:11.328476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.328569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.328590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.519 #41 NEW cov: 11895 ft: 14347 corp: 29/4626b lim: 320 exec/s: 41 rss: 73Mb L: 192/285 MS: 1 CopyPart- 00:07:24.519 [2024-04-24 19:14:11.388525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.388552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.388652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:affffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffffffffffff7 00:07:24.519 [2024-04-24 19:14:11.388668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.388767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffff3b cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.388781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.519 #42 NEW cov: 11902 ft: 14351 corp: 30/4818b lim: 320 exec/s: 42 rss: 73Mb L: 192/285 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:24.519 [2024-04-24 19:14:11.448926] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffff0aff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.448954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.449068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:24.519 [2024-04-24 19:14:11.449084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:24.519 [2024-04-24 19:14:11.449175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff 00:07:24.519 [2024-04-24 19:14:11.449191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:24.519 #43 NEW cov: 11902 ft: 14357 corp: 31/5043b lim: 320 exec/s: 43 rss: 73Mb L: 225/285 MS: 1 CrossOver- 00:07:24.519 [2024-04-24 19:14:11.508829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:24.519 [2024-04-24 19:14:11.508858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:24.519 #44 NEW cov: 11902 ft: 14398 corp: 32/5124b lim: 320 exec/s: 22 rss: 73Mb L: 81/285 MS: 1 EraseBytes- 00:07:24.519 #44 DONE cov: 11902 ft: 14398 corp: 32/5124b lim: 320 exec/s: 22 rss: 73Mb 00:07:24.519 ###### Recommended dictionary. ###### 00:07:24.519 "\000\000" # Uses: 1 00:07:24.519 ###### End of recommended dictionary. 
###### 00:07:24.519 Done 44 runs in 2 second(s) 00:07:24.776 19:14:11 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:24.776 19:14:11 -- ../common.sh@72 -- # (( i++ )) 00:07:24.776 19:14:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:24.776 19:14:11 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:24.776 19:14:11 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:24.776 19:14:11 -- nvmf/run.sh@24 -- # local timen=1 00:07:24.776 19:14:11 -- nvmf/run.sh@25 -- # local core=0x1 00:07:24.776 19:14:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:24.776 19:14:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:24.776 19:14:11 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:24.776 19:14:11 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:24.776 19:14:11 -- nvmf/run.sh@34 -- # printf %02d 1 00:07:24.776 19:14:11 -- nvmf/run.sh@34 -- # port=4401 00:07:24.776 19:14:11 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:24.776 19:14:11 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:24.776 19:14:11 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:24.776 19:14:11 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:24.776 19:14:11 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:24.776 19:14:11 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:24.776 [2024-04-24 19:14:11.695135] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:24.776 [2024-04-24 19:14:11.695202] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1620302 ] 00:07:24.776 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.034 [2024-04-24 19:14:11.879177] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.034 [2024-04-24 19:14:11.951379] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.034 [2024-04-24 19:14:12.010762] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:25.034 [2024-04-24 19:14:12.026964] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:25.034 INFO: Running with entropic power schedule (0xFF, 100). 
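The nvmf/run.sh trace above shows how each fuzzer instance is prepared before llvm_nvme_fuzz is launched: a per-instance TCP port is derived (44 plus the zero-padded fuzzer index), a fresh corpus directory is created, the JSON config's trsvcid is rewritten with sed, and two LeakSanitizer suppressions are registered. A minimal bash sketch of that pattern, reconstructed only from the commands visible in the trace (xtrace does not show redirections, so writing the sed output and the suppression entries to their files is an assumption):

start_llvm_fuzz_sketch() {
    local fuzzer_type=$1 timen=$2 core=$3
    local rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # path from the trace
    local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz

    # Each instance listens on its own port: "44" + zero-padded index (4401, 4402, ...).
    local port=44$(printf %02d $fuzzer_type)
    mkdir -p $corpus_dir
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Clone the template config, swapping the default trsvcid 4420 for this instance's port
    # (redirection into $nvmf_cfg assumed; the trace shows only the sed command itself).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg

    # Known in-target allocations are suppressed rather than reported as leaks.
    echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
    echo leak:nvmf_ctrlr_create >> $suppress_file

    LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
        $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
        -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg \
        -t $timen -D $corpus_dir -Z $fuzzer_type
}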
00:07:25.034 INFO: Seed: 1419478055 00:07:25.292 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:25.292 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:25.292 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:25.292 INFO: A corpus is not provided, starting from an empty corpus 00:07:25.292 #2 INITED exec/s: 0 rss: 64Mb 00:07:25.292 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:25.292 This may also happen if the target rejected all inputs we tried so far 00:07:25.292 [2024-04-24 19:14:12.071646] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:25.292 [2024-04-24 19:14:12.071725] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:25.292 [2024-04-24 19:14:12.071841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:230a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.292 [2024-04-24 19:14:12.071864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.292 [2024-04-24 19:14:12.071897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eded81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.292 [2024-04-24 19:14:12.071913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.550 NEW_FUNC[1/671]: 0x482600 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:25.550 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:25.550 #7 NEW cov: 11699 ft: 11700 corp: 2/13b lim: 30 exec/s: 0 rss: 71Mb L: 12/12 MS: 5 ShuffleBytes-InsertByte-CrossOver-InsertByte-InsertRepeatedBytes- 00:07:25.550 [2024-04-24 19:14:12.412469] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.550 [2024-04-24 19:14:12.412613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.550 [2024-04-24 19:14:12.412643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.550 #8 NEW cov: 11829 ft: 12424 corp: 3/23b lim: 30 exec/s: 0 rss: 71Mb L: 10/12 MS: 1 InsertRepeatedBytes- 00:07:25.550 [2024-04-24 19:14:12.472470] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.550 [2024-04-24 19:14:12.472590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff2383ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.550 [2024-04-24 19:14:12.472613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.550 #9 NEW cov: 11835 ft: 12788 corp: 4/33b lim: 30 exec/s: 0 rss: 71Mb L: 10/12 MS: 1 ChangeByte- 00:07:25.550 [2024-04-24 19:14:12.542657] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000edcd 00:07:25.550 [2024-04-24 19:14:12.542748] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x10000eded 00:07:25.550 [2024-04-24 19:14:12.542863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:230a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.550 [2024-04-24 19:14:12.542885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.550 [2024-04-24 19:14:12.542917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eded81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.550 [2024-04-24 19:14:12.542933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:25.808 #10 NEW cov: 11920 ft: 13020 corp: 5/45b lim: 30 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeBit- 00:07:25.808 [2024-04-24 19:14:12.612851] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.808 [2024-04-24 19:14:12.612989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff2b83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.808 [2024-04-24 19:14:12.613012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 #11 NEW cov: 11920 ft: 13130 corp: 6/55b lim: 30 exec/s: 0 rss: 72Mb L: 10/12 MS: 1 ChangeBit- 00:07:25.808 [2024-04-24 19:14:12.683005] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.808 [2024-04-24 19:14:12.683151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.808 [2024-04-24 19:14:12.683175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 #12 NEW cov: 11920 ft: 13215 corp: 7/65b lim: 30 exec/s: 0 rss: 72Mb L: 10/12 MS: 1 ShuffleBytes- 00:07:25.808 [2024-04-24 19:14:12.733107] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:25.808 [2024-04-24 19:14:12.733245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.808 [2024-04-24 19:14:12.733268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:25.808 #13 NEW cov: 11920 ft: 13283 corp: 8/75b lim: 30 exec/s: 0 rss: 72Mb L: 10/12 MS: 1 ChangeBit- 00:07:25.808 [2024-04-24 19:14:12.783294] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000023ff 00:07:25.808 [2024-04-24 19:14:12.783414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:25.808 [2024-04-24 19:14:12.783436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.066 #14 NEW cov: 11920 ft: 13415 corp: 9/82b lim: 30 exec/s: 0 rss: 72Mb L: 7/12 MS: 1 EraseBytes- 00:07:26.066 [2024-04-24 19:14:12.853468] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000edcd 00:07:26.066 [2024-04-24 19:14:12.853556] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 
00:07:26.066 [2024-04-24 19:14:12.853668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:230a810a cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:12.853688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.066 [2024-04-24 19:14:12.853720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0aed81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:12.853736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.066 #15 NEW cov: 11920 ft: 13452 corp: 10/97b lim: 30 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 CopyPart- 00:07:26.066 [2024-04-24 19:14:12.913613] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000023ff 00:07:26.066 [2024-04-24 19:14:12.913753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:12.913776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.066 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:26.066 #18 NEW cov: 11937 ft: 13487 corp: 11/104b lim: 30 exec/s: 0 rss: 72Mb L: 7/15 MS: 3 EraseBytes-ChangeBit-CopyPart- 00:07:26.066 [2024-04-24 19:14:12.983828] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001232 00:07:26.066 [2024-04-24 19:14:12.983904] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000012ed 00:07:26.066 [2024-04-24 19:14:12.984013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:230a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:12.984034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.066 [2024-04-24 19:14:12.984073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f5120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:12.984090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.066 #19 NEW cov: 11937 ft: 13522 corp: 12/119b lim: 30 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 ChangeBinInt- 00:07:26.066 [2024-04-24 19:14:13.043957] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007f23 00:07:26.066 [2024-04-24 19:14:13.044046] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000023ff 00:07:26.066 [2024-04-24 19:14:13.044162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 [2024-04-24 19:14:13.044183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.066 [2024-04-24 19:14:13.044215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.066 
[2024-04-24 19:14:13.044231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.325 #20 NEW cov: 11937 ft: 13542 corp: 13/132b lim: 30 exec/s: 20 rss: 72Mb L: 13/15 MS: 1 CopyPart- 00:07:26.325 [2024-04-24 19:14:13.104133] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.325 [2024-04-24 19:14:13.104221] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aed 00:07:26.325 [2024-04-24 19:14:13.104281] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:26.325 [2024-04-24 19:14:13.104399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.325 [2024-04-24 19:14:13.104420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.325 [2024-04-24 19:14:13.104452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0223 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.325 [2024-04-24 19:14:13.104469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.325 [2024-04-24 19:14:13.104497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:edcd81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.325 [2024-04-24 19:14:13.104513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.325 #21 NEW cov: 11937 ft: 13858 corp: 14/152b lim: 30 exec/s: 21 rss: 72Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:26.325 [2024-04-24 19:14:13.164239] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.325 [2024-04-24 19:14:13.164375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff2383ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.325 [2024-04-24 19:14:13.164398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.325 #22 NEW cov: 11937 ft: 13870 corp: 15/160b lim: 30 exec/s: 22 rss: 72Mb L: 8/20 MS: 1 EraseBytes- 00:07:26.325 [2024-04-24 19:14:13.214389] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001232 00:07:26.326 [2024-04-24 19:14:13.214477] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000012ed 00:07:26.326 [2024-04-24 19:14:13.214537] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.326 [2024-04-24 19:14:13.214644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:230a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.326 [2024-04-24 19:14:13.214665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 [2024-04-24 19:14:13.214697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:f5120212 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.326 [2024-04-24 19:14:13.214713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.326 [2024-04-24 19:14:13.214742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.326 [2024-04-24 19:14:13.214758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.326 #23 NEW cov: 11937 ft: 13926 corp: 16/180b lim: 30 exec/s: 23 rss: 72Mb L: 20/20 MS: 1 CrossOver- 00:07:26.326 [2024-04-24 19:14:13.284578] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ffff 00:07:26.326 [2024-04-24 19:14:13.284717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff2302ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.326 [2024-04-24 19:14:13.284740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.326 #24 NEW cov: 11937 ft: 13965 corp: 17/188b lim: 30 exec/s: 24 rss: 73Mb L: 8/20 MS: 1 ChangeBit- 00:07:26.584 [2024-04-24 19:14:13.354766] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.584 [2024-04-24 19:14:13.354898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff2b83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.584 [2024-04-24 19:14:13.354921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.584 #25 NEW cov: 11937 ft: 13980 corp: 18/196b lim: 30 exec/s: 25 rss: 73Mb L: 8/20 MS: 1 EraseBytes- 00:07:26.584 [2024-04-24 19:14:13.424968] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007f3b 00:07:26.584 [2024-04-24 19:14:13.425057] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007f23 00:07:26.584 [2024-04-24 19:14:13.425172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.584 [2024-04-24 19:14:13.425194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.584 [2024-04-24 19:14:13.425226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:23ff020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.584 [2024-04-24 19:14:13.425243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.584 #26 NEW cov: 11937 ft: 13985 corp: 19/210b lim: 30 exec/s: 26 rss: 73Mb L: 14/20 MS: 1 InsertByte- 00:07:26.585 [2024-04-24 19:14:13.485163] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.585 [2024-04-24 19:14:13.485250] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aed 00:07:26.585 [2024-04-24 19:14:13.485310] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.585 [2024-04-24 19:14:13.485368] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000edcd 00:07:26.585 [2024-04-24 19:14:13.485425] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:26.585 [2024-04-24 19:14:13.485534] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.485555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 [2024-04-24 19:14:13.485587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0223 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.485603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.585 [2024-04-24 19:14:13.485632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.485648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.585 [2024-04-24 19:14:13.485676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.485692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.585 [2024-04-24 19:14:13.485719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:eded81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.485735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.585 #27 NEW cov: 11937 ft: 14527 corp: 20/240b lim: 30 exec/s: 27 rss: 73Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:26.585 [2024-04-24 19:14:13.555303] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.585 [2024-04-24 19:14:13.555444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff6483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.585 [2024-04-24 19:14:13.555467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.585 #28 NEW cov: 11937 ft: 14542 corp: 21/248b lim: 30 exec/s: 28 rss: 73Mb L: 8/30 MS: 1 ChangeByte- 00:07:26.842 [2024-04-24 19:14:13.605479] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007f23 00:07:26.842 [2024-04-24 19:14:13.605554] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200007f23 00:07:26.842 [2024-04-24 19:14:13.605662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.605683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.605714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff3b020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.605731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.842 #29 NEW cov: 11937 ft: 
14545 corp: 22/262b lim: 30 exec/s: 29 rss: 73Mb L: 14/30 MS: 1 ShuffleBytes- 00:07:26.842 [2024-04-24 19:14:13.675657] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.842 [2024-04-24 19:14:13.675733] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000aed 00:07:26.842 [2024-04-24 19:14:13.675793] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:26.842 [2024-04-24 19:14:13.675900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.675920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.675951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff81d7 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.675967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.675996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:edcd81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.676012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.842 #30 NEW cov: 11937 ft: 14641 corp: 23/282b lim: 30 exec/s: 30 rss: 73Mb L: 20/30 MS: 1 ChangeBinInt- 00:07:26.842 [2024-04-24 19:14:13.735843] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000023ff 00:07:26.842 [2024-04-24 19:14:13.735933] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aed 00:07:26.842 [2024-04-24 19:14:13.735995] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.842 [2024-04-24 19:14:13.736054] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000edcd 00:07:26.842 [2024-04-24 19:14:13.736122] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:26.842 [2024-04-24 19:14:13.736246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.736267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.736299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.736315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.736343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.736359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.736391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 
nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.736407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.736435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:eded81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.736451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:26.842 #31 NEW cov: 11937 ft: 14726 corp: 24/312b lim: 30 exec/s: 31 rss: 73Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:26.842 [2024-04-24 19:14:13.805996] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:26.842 [2024-04-24 19:14:13.806080] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aed 00:07:26.842 [2024-04-24 19:14:13.806158] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:26.842 [2024-04-24 19:14:13.806277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.806313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.806346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0223 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.806363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.806392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:edcd81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.806408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.842 #32 NEW cov: 11937 ft: 14750 corp: 25/333b lim: 30 exec/s: 32 rss: 73Mb L: 21/30 MS: 1 InsertByte- 00:07:26.842 [2024-04-24 19:14:13.856113] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36864) > buf size (4096) 00:07:26.842 [2024-04-24 19:14:13.856189] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (90468) > buf size (4096) 00:07:26.842 [2024-04-24 19:14:13.856249] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (90468) > buf size (4096) 00:07:26.842 [2024-04-24 19:14:13.856307] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:26.842 [2024-04-24 19:14:13.856416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff0058 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.856437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.856469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:58580058 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.856485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.856514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:58580058 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.856530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.842 [2024-04-24 19:14:13.856559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:580a837f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.842 [2024-04-24 19:14:13.856574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.100 #33 NEW cov: 11960 ft: 14786 corp: 26/357b lim: 30 exec/s: 33 rss: 73Mb L: 24/30 MS: 1 InsertRepeatedBytes- 00:07:27.100 [2024-04-24 19:14:13.916254] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eded 00:07:27.100 [2024-04-24 19:14:13.916326] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:27.100 [2024-04-24 19:14:13.916429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2389020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.916449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.100 [2024-04-24 19:14:13.916479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cd0a81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.916495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.100 #34 NEW cov: 11960 ft: 14801 corp: 27/373b lim: 30 exec/s: 34 rss: 73Mb L: 16/30 MS: 1 InsertByte- 00:07:27.100 [2024-04-24 19:14:13.966371] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:27.100 [2024-04-24 19:14:13.966459] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000aed 00:07:27.100 [2024-04-24 19:14:13.966519] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (243512) > buf size (4096) 00:07:27.100 [2024-04-24 19:14:13.966576] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eded 00:07:27.100 [2024-04-24 19:14:13.966689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.966710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.100 [2024-04-24 19:14:13.966742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff0223 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.966758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.100 [2024-04-24 19:14:13.966788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:edcd00bc cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.966804] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.100 [2024-04-24 19:14:13.966833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:bcbc81ed cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:13.966848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.100 #35 NEW cov: 11967 ft: 14861 corp: 28/400b lim: 30 exec/s: 35 rss: 73Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:27.100 [2024-04-24 19:14:14.036551] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:27.100 [2024-04-24 19:14:14.036672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff6483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.100 [2024-04-24 19:14:14.036695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.100 #36 NEW cov: 11967 ft: 14874 corp: 29/408b lim: 30 exec/s: 18 rss: 73Mb L: 8/30 MS: 1 ChangeBit- 00:07:27.100 #36 DONE cov: 11967 ft: 14874 corp: 29/408b lim: 30 exec/s: 18 rss: 73Mb 00:07:27.100 Done 36 runs in 2 second(s) 00:07:27.358 19:14:14 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:27.358 19:14:14 -- ../common.sh@72 -- # (( i++ )) 00:07:27.358 19:14:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.358 19:14:14 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:27.358 19:14:14 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:27.358 19:14:14 -- nvmf/run.sh@24 -- # local timen=1 00:07:27.358 19:14:14 -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.358 19:14:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:27.358 19:14:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:27.358 19:14:14 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:27.358 19:14:14 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:27.358 19:14:14 -- nvmf/run.sh@34 -- # printf %02d 2 00:07:27.358 19:14:14 -- nvmf/run.sh@34 -- # port=4402 00:07:27.358 19:14:14 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:27.358 19:14:14 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:27.358 19:14:14 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.358 19:14:14 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:27.358 19:14:14 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:27.358 19:14:14 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:27.358 [2024-04-24 19:14:14.262424] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
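The second instance above repeats the same setup with fuzzer index 2 (port 4402, /tmp/fuzz_json_2.conf, corpus llvm_nvmf_2, -Z 2), and each instance closes with a libFuzzer summary pair like "#36 DONE cov: 11967 ..." followed by "Done 36 runs in 2 second(s)". A hedged helper for pulling those per-run summaries out of a saved copy of this console output (build.log is a hypothetical file name; the status-line format is assumed to match what appears above):

awk '
    /#[0-9]+ DONE cov:/   { match($0, /cov: [0-9]+/); cov = substr($0, RSTART + 5, RLENGTH - 5) }
    /Done [0-9]+ runs in/ { match($0, /Done [0-9]+ runs in [0-9]+ second/);
                            print substr($0, RSTART, RLENGTH) "(s), final cov: " cov }
' build.log

For the two runs completed so far this would print "Done 44 runs in 2 second(s), final cov: 11902" and "Done 36 runs in 2 second(s), final cov: 11967".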
00:07:27.358 [2024-04-24 19:14:14.262509] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1620658 ] 00:07:27.358 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.615 [2024-04-24 19:14:14.463431] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.615 [2024-04-24 19:14:14.536008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.615 [2024-04-24 19:14:14.595467] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.615 [2024-04-24 19:14:14.611666] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:27.615 INFO: Running with entropic power schedule (0xFF, 100). 00:07:27.615 INFO: Seed: 4003490761 00:07:27.872 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:27.872 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:27.872 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:27.872 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.872 #2 INITED exec/s: 0 rss: 64Mb 00:07:27.872 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:27.872 This may also happen if the target rejected all inputs we tried so far 00:07:27.872 [2024-04-24 19:14:14.657199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.872 [2024-04-24 19:14:14.657227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.872 [2024-04-24 19:14:14.657299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.872 [2024-04-24 19:14:14.657314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.872 [2024-04-24 19:14:14.657370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.873 [2024-04-24 19:14:14.657384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.130 NEW_FUNC[1/667]: 0x4850b0 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:28.130 NEW_FUNC[2/667]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:28.130 #8 NEW cov: 11631 ft: 11632 corp: 2/23b lim: 35 exec/s: 0 rss: 70Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:28.130 [2024-04-24 19:14:14.998002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:14.998041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:14.998120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:14.998135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:14.998195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:14.998221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.130 NEW_FUNC[1/3]: 0x1566560 in nvme_ctrlr_process_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3802 00:07:28.130 NEW_FUNC[2/3]: 0x1737d90 in spdk_nvme_probe_poll_async /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme.c:1511 00:07:28.130 #9 NEW cov: 11785 ft: 12214 corp: 3/46b lim: 35 exec/s: 0 rss: 70Mb L: 23/23 MS: 1 CrossOver- 00:07:28.130 [2024-04-24 19:14:15.048052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.048086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:15.048159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.048174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:15.048227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.048241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.130 #10 NEW cov: 11791 ft: 12393 corp: 4/69b lim: 35 exec/s: 0 rss: 70Mb L: 23/23 MS: 1 ShuffleBytes- 00:07:28.130 [2024-04-24 19:14:15.088002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a72003b cdw11:72007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.088028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.130 #14 NEW cov: 11876 ft: 13021 corp: 5/80b lim: 35 exec/s: 0 rss: 71Mb L: 11/23 MS: 4 ShuffleBytes-CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:28.130 [2024-04-24 19:14:15.128186] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.130 [2024-04-24 19:14:15.128404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.128430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:15.128488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.128504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.130 [2024-04-24 19:14:15.128562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.130 [2024-04-24 19:14:15.128579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.388 #15 NEW cov: 11885 ft: 13200 corp: 6/107b lim: 35 exec/s: 0 rss: 71Mb L: 27/27 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:28.388 [2024-04-24 19:14:15.168424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.168450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.388 [2024-04-24 19:14:15.168504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.168519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.388 [2024-04-24 19:14:15.168573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.168587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.388 #16 NEW cov: 11885 ft: 13280 corp: 7/130b lim: 35 exec/s: 0 rss: 71Mb L: 23/27 MS: 1 ShuffleBytes- 00:07:28.388 [2024-04-24 19:14:15.208287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a72003b cdw11:72007272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.208312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.388 #22 NEW cov: 11885 ft: 13320 corp: 8/141b lim: 35 exec/s: 0 rss: 71Mb L: 11/27 MS: 1 CMP- DE: "\016\000\000\000"- 00:07:28.388 [2024-04-24 19:14:15.248455] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.388 [2024-04-24 19:14:15.248692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.248717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.388 [2024-04-24 19:14:15.248772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.248787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.388 [2024-04-24 19:14:15.248840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.248856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.388 #23 NEW cov: 11885 ft: 13353 corp: 9/163b lim: 35 
exec/s: 0 rss: 71Mb L: 22/27 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:07:28.388 [2024-04-24 19:14:15.288593] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.388 [2024-04-24 19:14:15.288802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.288827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.388 [2024-04-24 19:14:15.288883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.388 [2024-04-24 19:14:15.288897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.288954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.288970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.389 #24 NEW cov: 11885 ft: 13374 corp: 10/189b lim: 35 exec/s: 0 rss: 72Mb L: 26/27 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:28.389 [2024-04-24 19:14:15.328732] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.389 [2024-04-24 19:14:15.329049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.329080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.329138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.329152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.329208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.329224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.329281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.329294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.389 #25 NEW cov: 11885 ft: 13927 corp: 11/220b lim: 35 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:28.389 [2024-04-24 19:14:15.378869] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.389 [2024-04-24 19:14:15.379182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 
19:14:15.379208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.379264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.379279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.379332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:07000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.379348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.389 [2024-04-24 19:14:15.379401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.389 [2024-04-24 19:14:15.379415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.647 #26 NEW cov: 11885 ft: 13972 corp: 12/251b lim: 35 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 ChangeBinInt- 00:07:28.647 [2024-04-24 19:14:15.429147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.429172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.429258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d0a006d cdw11:6d006d0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.429272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.429328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.429342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.647 #27 NEW cov: 11885 ft: 14006 corp: 13/274b lim: 35 exec/s: 0 rss: 72Mb L: 23/31 MS: 1 CopyPart- 00:07:28.647 [2024-04-24 19:14:15.469266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.469290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.469347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.469362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.469416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d00006d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 
19:14:15.469430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.647 #28 NEW cov: 11885 ft: 14033 corp: 14/301b lim: 35 exec/s: 0 rss: 72Mb L: 27/31 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:28.647 [2024-04-24 19:14:15.509395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.509420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.509476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:30006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.509491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.509561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.509576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.647 #29 NEW cov: 11885 ft: 14049 corp: 15/324b lim: 35 exec/s: 0 rss: 72Mb L: 23/31 MS: 1 InsertByte- 00:07:28.647 [2024-04-24 19:14:15.549498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6c006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.549524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.549581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:30006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.549595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.549651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.549665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.647 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:28.647 #30 NEW cov: 11908 ft: 14122 corp: 16/347b lim: 35 exec/s: 0 rss: 72Mb L: 23/31 MS: 1 ChangeBit- 00:07:28.647 [2024-04-24 19:14:15.599770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.599795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.599850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.599864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:28.647 [2024-04-24 19:14:15.599919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d0030 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.647 [2024-04-24 19:14:15.599933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.647 [2024-04-24 19:14:15.599987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:6d00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.648 [2024-04-24 19:14:15.600000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.648 #31 NEW cov: 11908 ft: 14155 corp: 17/377b lim: 35 exec/s: 0 rss: 72Mb L: 30/31 MS: 1 CrossOver- 00:07:28.648 [2024-04-24 19:14:15.639476] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.648 [2024-04-24 19:14:15.639789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.648 [2024-04-24 19:14:15.639814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.648 [2024-04-24 19:14:15.639872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.648 [2024-04-24 19:14:15.639889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.648 [2024-04-24 19:14:15.639944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.648 [2024-04-24 19:14:15.639958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.648 #32 NEW cov: 11908 ft: 14177 corp: 18/403b lim: 35 exec/s: 32 rss: 72Mb L: 26/31 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:07:28.906 [2024-04-24 19:14:15.680015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d006d cdw11:6d006c6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.680040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.680101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.680115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.680171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.680184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.680239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 
[2024-04-24 19:14:15.680255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.906 #33 NEW cov: 11908 ft: 14198 corp: 19/434b lim: 35 exec/s: 33 rss: 72Mb L: 31/31 MS: 1 CopyPart- 00:07:28.906 [2024-04-24 19:14:15.719969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d4d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.719994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.720051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d0a006d cdw11:6d006d0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.720070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.720125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.720139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 #34 NEW cov: 11908 ft: 14248 corp: 20/457b lim: 35 exec/s: 34 rss: 72Mb L: 23/31 MS: 1 ChangeBit- 00:07:28.906 [2024-04-24 19:14:15.759790] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.906 [2024-04-24 19:14:15.760143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.760169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.760224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.760242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.760297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.760311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 #35 NEW cov: 11908 ft: 14257 corp: 21/483b lim: 35 exec/s: 35 rss: 72Mb L: 26/31 MS: 1 CrossOver- 00:07:28.906 [2024-04-24 19:14:15.799995] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:28.906 [2024-04-24 19:14:15.800228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.800254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.800310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 
19:14:15.800324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.800379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:04000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.800395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 #36 NEW cov: 11908 ft: 14276 corp: 22/510b lim: 35 exec/s: 36 rss: 72Mb L: 27/31 MS: 1 CMP- DE: "\000\004"- 00:07:28.906 [2024-04-24 19:14:15.840355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.840383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.840455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.840470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.840523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:00006d0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.840537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 #37 NEW cov: 11908 ft: 14289 corp: 23/533b lim: 35 exec/s: 37 rss: 72Mb L: 23/31 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:07:28.906 [2024-04-24 19:14:15.880425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.880450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.880506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:6d00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.880519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.880575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.880589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.906 #38 NEW cov: 11908 ft: 14299 corp: 24/555b lim: 35 exec/s: 38 rss: 72Mb L: 22/31 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:07:28.906 [2024-04-24 19:14:15.920534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.920559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.920633] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.920648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.906 [2024-04-24 19:14:15.920705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.906 [2024-04-24 19:14:15.920718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.164 #39 NEW cov: 11908 ft: 14310 corp: 25/577b lim: 35 exec/s: 39 rss: 72Mb L: 22/31 MS: 1 ChangeBit- 00:07:29.164 [2024-04-24 19:14:15.960603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.164 [2024-04-24 19:14:15.960628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.164 [2024-04-24 19:14:15.960701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.164 [2024-04-24 19:14:15.960716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.164 [2024-04-24 19:14:15.960772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.164 [2024-04-24 19:14:15.960789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.164 #40 NEW cov: 11908 ft: 14354 corp: 26/600b lim: 35 exec/s: 40 rss: 72Mb L: 23/31 MS: 1 ShuffleBytes- 00:07:29.164 [2024-04-24 19:14:16.000595] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.164 [2024-04-24 19:14:16.000916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.164 [2024-04-24 19:14:16.000941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.164 [2024-04-24 19:14:16.000996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:00000e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.164 [2024-04-24 19:14:16.001011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.164 [2024-04-24 19:14:16.001070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:6d00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.001088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.001141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:6d00006d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.001156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.165 #41 NEW cov: 11908 ft: 14373 corp: 27/630b lim: 35 exec/s: 41 rss: 73Mb L: 30/31 MS: 1 CrossOver- 00:07:29.165 [2024-04-24 19:14:16.050802] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.165 [2024-04-24 19:14:16.051129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.051155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.051210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.051224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.051279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:07000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.051295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.051351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.051366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.165 #42 NEW cov: 11908 ft: 14382 corp: 28/661b lim: 35 exec/s: 42 rss: 73Mb L: 31/31 MS: 1 ShuffleBytes- 00:07:29.165 [2024-04-24 19:14:16.100917] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.165 [2024-04-24 19:14:16.101240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.101266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.101322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.101341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.101396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.101413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.101469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.101484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.165 #43 NEW cov: 11908 ft: 14394 
corp: 29/695b lim: 35 exec/s: 43 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:29.165 [2024-04-24 19:14:16.141338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d006d cdw11:6d006c6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.141364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.141424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.141439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.141494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.141508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.165 [2024-04-24 19:14:16.141562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d0030 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.165 [2024-04-24 19:14:16.141577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.165 #44 NEW cov: 11908 ft: 14398 corp: 30/726b lim: 35 exec/s: 44 rss: 73Mb L: 31/34 MS: 1 ShuffleBytes- 00:07:29.423 [2024-04-24 19:14:16.191512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.191538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.191595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d27006d cdw11:1c008a03 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.191611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.191665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0900006f cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.191680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.191735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.191749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.423 #45 NEW cov: 11908 ft: 14402 corp: 31/757b lim: 35 exec/s: 45 rss: 73Mb L: 31/34 MS: 1 CMP- DE: "'\212\003\034\350o\011\000"- 00:07:29.423 [2024-04-24 19:14:16.231506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d00ed6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.231534] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.231591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.231606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.231664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.231678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.423 #46 NEW cov: 11908 ft: 14412 corp: 32/780b lim: 35 exec/s: 46 rss: 73Mb L: 23/34 MS: 1 ChangeBit- 00:07:29.423 [2024-04-24 19:14:16.271605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6c006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.271630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.271703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:30006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.271718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.271775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:17006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.271789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.423 #47 NEW cov: 11908 ft: 14433 corp: 33/803b lim: 35 exec/s: 47 rss: 73Mb L: 23/34 MS: 1 ChangeBinInt- 00:07:29.423 [2024-04-24 19:14:16.311527] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.423 [2024-04-24 19:14:16.311645] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.423 [2024-04-24 19:14:16.311887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.311912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.311970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.311985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.312040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:04ba0000 cdw11:0000baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.312056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.423 
[2024-04-24 19:14:16.312116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:006d0000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.312133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.423 #48 NEW cov: 11908 ft: 14466 corp: 34/833b lim: 35 exec/s: 48 rss: 73Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:07:29.423 [2024-04-24 19:14:16.361422] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.423 [2024-04-24 19:14:16.361539] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.423 [2024-04-24 19:14:16.361740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.361770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.361826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.361842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 #49 NEW cov: 11908 ft: 14646 corp: 35/851b lim: 35 exec/s: 49 rss: 73Mb L: 18/34 MS: 1 InsertRepeatedBytes- 00:07:29.423 [2024-04-24 19:14:16.401776] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.423 [2024-04-24 19:14:16.402109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0e000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.402135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.402194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.402209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.402263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:07000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.402278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.423 [2024-04-24 19:14:16.402331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.423 [2024-04-24 19:14:16.402346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.423 #50 NEW cov: 11908 ft: 14673 corp: 36/882b lim: 35 exec/s: 50 rss: 73Mb L: 31/34 MS: 1 PersAutoDict- DE: "\016\000\000\000"- 00:07:29.681 [2024-04-24 19:14:16.452226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.452251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.452309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:ff006dff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.452323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.452381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:012900ff cdw11:6d004573 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.452394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.452451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.452465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.681 #51 NEW cov: 11908 ft: 14680 corp: 37/913b lim: 35 exec/s: 51 rss: 73Mb L: 31/34 MS: 1 CMP- DE: "\377\377\377\377\001)Es"- 00:07:29.681 [2024-04-24 19:14:16.492228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d006d cdw11:0a006c6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.492256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.492328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.492343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.492401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.492414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.681 #52 NEW cov: 11908 ft: 14688 corp: 38/937b lim: 35 exec/s: 52 rss: 73Mb L: 24/34 MS: 1 CrossOver- 00:07:29.681 [2024-04-24 19:14:16.532189] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.681 [2024-04-24 19:14:16.532322] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:29.681 [2024-04-24 19:14:16.532550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.532576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.532633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6c006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.532648] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.532703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:04ba0000 cdw11:0000baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.532719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.532774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:006d0000 cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.532791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.681 #53 NEW cov: 11908 ft: 14696 corp: 39/967b lim: 35 exec/s: 53 rss: 73Mb L: 30/34 MS: 1 ChangeBit- 00:07:29.681 [2024-04-24 19:14:16.582622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d0a000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.582650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.582711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:ff006dff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.582726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.582785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:012900ff cdw11:93004573 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.582802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.582861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.582875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.681 #54 NEW cov: 11908 ft: 14712 corp: 40/998b lim: 35 exec/s: 54 rss: 73Mb L: 31/34 MS: 1 ChangeBinInt- 00:07:29.681 [2024-04-24 19:14:16.632675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:6d6d000a cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.632701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.632759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.632774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.681 [2024-04-24 19:14:16.632833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:6d6d006d cdw11:6d006d6d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.681 [2024-04-24 19:14:16.632847] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.681 #55 NEW cov: 11908 ft: 14784 corp: 41/1020b lim: 35 exec/s: 27 rss: 73Mb L: 22/34 MS: 1 ShuffleBytes- 00:07:29.681 #55 DONE cov: 11908 ft: 14784 corp: 41/1020b lim: 35 exec/s: 27 rss: 73Mb 00:07:29.681 ###### Recommended dictionary. ###### 00:07:29.681 "\000\000\000\000" # Uses: 3 00:07:29.681 "\016\000\000\000" # Uses: 5 00:07:29.681 "\000\004" # Uses: 0 00:07:29.682 "'\212\003\034\350o\011\000" # Uses: 0 00:07:29.682 "\377\377\377\377\001)Es" # Uses: 0 00:07:29.682 ###### End of recommended dictionary. ###### 00:07:29.682 Done 55 runs in 2 second(s) 00:07:29.940 19:14:16 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:29.940 19:14:16 -- ../common.sh@72 -- # (( i++ )) 00:07:29.940 19:14:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.940 19:14:16 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:29.940 19:14:16 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:29.940 19:14:16 -- nvmf/run.sh@24 -- # local timen=1 00:07:29.940 19:14:16 -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.940 19:14:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:29.941 19:14:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:29.941 19:14:16 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:29.941 19:14:16 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:29.941 19:14:16 -- nvmf/run.sh@34 -- # printf %02d 3 00:07:29.941 19:14:16 -- nvmf/run.sh@34 -- # port=4403 00:07:29.941 19:14:16 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:29.941 19:14:16 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:29.941 19:14:16 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.941 19:14:16 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:29.941 19:14:16 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:29.941 19:14:16 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:29.941 [2024-04-24 19:14:16.821159] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
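The nvmf/run.sh trace above shows the launch sequence that ../common.sh repeats for each fuzzer type: build a per-run TCP port by concatenating 44 with the zero-padded run number (printf %02d 3 gives 03, hence port 4403), rewrite the stock fuzz_json.conf so the NVMe-over-TCP target listens on that port, write the LSan leak suppressions, and start llvm_nvme_fuzz against the resulting transport ID. A minimal sketch of reproducing run 3 by hand, assuming a built SPDK tree whose location is supplied in SPDK_DIR (that variable and the /path/to placeholder are assumptions; the flags and relative paths are copied from the run.sh@45 line above, and the redirect into /tmp/fuzz_json_3.conf is inferred from nvmf_cfg at run.sh@27):

  # Sketch only: manual reproduction of fuzz run 3 under the stated assumptions.
  SPDK_DIR=/path/to/spdk   # assumed location of a built SPDK checkout
  mkdir -p "$SPDK_DIR/../corpus/llvm_nvmf_3"
  # Point the target at the per-run port, as run.sh@38 does for port 4403.
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_3.conf
  # Launch the fuzzer with the flags visible in the run.sh@45 trace.
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_DIR/../output/llvm/" \
      -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' \
      -c /tmp/fuzz_json_3.conf -t 1 \
      -D "$SPDK_DIR/../corpus/llvm_nvmf_3" -Z 3

In this trace -t carries the timen=1 value, -D names the corpus directory created at run.sh@35, and -Z 3 matches fuzzer_type=3; the suppress-file plumbing behind the two echo leak: lines is not fully visible in the log, so the sketch leaves it out.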
00:07:29.941 [2024-04-24 19:14:16.821233] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621017 ] 00:07:29.941 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.198 [2024-04-24 19:14:17.016140] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.198 [2024-04-24 19:14:17.088874] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.198 [2024-04-24 19:14:17.148100] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.198 [2024-04-24 19:14:17.164308] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:30.198 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.198 INFO: Seed: 2261521285 00:07:30.198 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:30.198 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:30.198 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:30.198 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.198 #2 INITED exec/s: 0 rss: 64Mb 00:07:30.198 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:30.198 This may also happen if the target rejected all inputs we tried so far 00:07:30.714 NEW_FUNC[1/659]: 0x486d80 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:30.714 NEW_FUNC[2/659]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.714 #7 NEW cov: 11553 ft: 11550 corp: 2/6b lim: 20 exec/s: 0 rss: 70Mb L: 5/5 MS: 5 CopyPart-CopyPart-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:30.714 #18 NEW cov: 11683 ft: 11948 corp: 3/11b lim: 20 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:07:30.714 #25 NEW cov: 11689 ft: 12245 corp: 4/15b lim: 20 exec/s: 0 rss: 71Mb L: 4/5 MS: 2 CMP-InsertByte- DE: "\007\000"- 00:07:30.714 #26 NEW cov: 11774 ft: 12624 corp: 5/19b lim: 20 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:30.714 #27 NEW cov: 11774 ft: 12699 corp: 6/23b lim: 20 exec/s: 0 rss: 71Mb L: 4/5 MS: 1 ChangeBit- 00:07:30.714 [2024-04-24 19:14:17.711097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:30.714 [2024-04-24 19:14:17.711140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.971 NEW_FUNC[1/20]: 0x115f4e0 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3282 00:07:30.971 NEW_FUNC[2/20]: 0x1160060 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:30.971 #31 NEW cov: 12112 ft: 13481 corp: 7/37b lim: 20 exec/s: 0 rss: 72Mb L: 14/14 MS: 4 ChangeByte-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:30.971 #37 NEW cov: 12112 ft: 13533 corp: 8/41b lim: 20 exec/s: 0 rss: 72Mb L: 4/14 MS: 1 ShuffleBytes- 00:07:30.971 #38 NEW cov: 12112 ft: 13576 corp: 9/46b lim: 20 exec/s: 0 rss: 72Mb L: 5/14 MS: 1 ChangeBit- 00:07:30.971 #39 NEW cov: 12112 ft: 13593 corp: 10/51b lim: 20 exec/s: 0 rss: 72Mb L: 5/14 MS: 1 InsertByte- 00:07:30.971 #41 NEW cov: 
12112 ft: 13700 corp: 11/56b lim: 20 exec/s: 0 rss: 72Mb L: 5/14 MS: 2 EraseBytes-CrossOver- 00:07:30.971 #42 NEW cov: 12112 ft: 13728 corp: 12/61b lim: 20 exec/s: 0 rss: 72Mb L: 5/14 MS: 1 ChangeBinInt- 00:07:31.228 #43 NEW cov: 12112 ft: 13751 corp: 13/65b lim: 20 exec/s: 0 rss: 72Mb L: 4/14 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:31.228 #44 NEW cov: 12112 ft: 13847 corp: 14/72b lim: 20 exec/s: 0 rss: 72Mb L: 7/14 MS: 1 CopyPart- 00:07:31.228 [2024-04-24 19:14:18.042011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.228 [2024-04-24 19:14:18.042049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.228 #45 NEW cov: 12112 ft: 13887 corp: 15/87b lim: 20 exec/s: 0 rss: 72Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:07:31.228 [2024-04-24 19:14:18.082266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.228 [2024-04-24 19:14:18.082292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.228 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.229 #46 NEW cov: 12152 ft: 14169 corp: 16/104b lim: 20 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:07:31.229 #47 NEW cov: 12152 ft: 14185 corp: 17/110b lim: 20 exec/s: 0 rss: 72Mb L: 6/17 MS: 1 InsertByte- 00:07:31.229 #48 NEW cov: 12152 ft: 14190 corp: 18/117b lim: 20 exec/s: 0 rss: 72Mb L: 7/17 MS: 1 CopyPart- 00:07:31.229 #49 NEW cov: 12152 ft: 14203 corp: 19/121b lim: 20 exec/s: 49 rss: 72Mb L: 4/17 MS: 1 CopyPart- 00:07:31.486 [2024-04-24 19:14:18.252716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.486 [2024-04-24 19:14:18.252743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.486 #50 NEW cov: 12152 ft: 14248 corp: 20/140b lim: 20 exec/s: 50 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:31.486 #51 NEW cov: 12152 ft: 14287 corp: 21/147b lim: 20 exec/s: 51 rss: 73Mb L: 7/19 MS: 1 ChangeBit- 00:07:31.486 #52 NEW cov: 12152 ft: 14297 corp: 22/154b lim: 20 exec/s: 52 rss: 73Mb L: 7/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:31.486 #53 NEW cov: 12152 ft: 14306 corp: 23/160b lim: 20 exec/s: 53 rss: 73Mb L: 6/19 MS: 1 InsertByte- 00:07:31.486 #54 NEW cov: 12153 ft: 14507 corp: 24/168b lim: 20 exec/s: 54 rss: 73Mb L: 8/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:31.486 #55 NEW cov: 12153 ft: 14516 corp: 25/174b lim: 20 exec/s: 55 rss: 73Mb L: 6/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:31.743 #56 NEW cov: 12153 ft: 14521 corp: 26/179b lim: 20 exec/s: 56 rss: 73Mb L: 5/19 MS: 1 ShuffleBytes- 00:07:31.743 #57 NEW cov: 12153 ft: 14536 corp: 27/186b lim: 20 exec/s: 57 rss: 73Mb L: 7/19 MS: 1 ChangeByte- 00:07:31.743 #58 NEW cov: 12153 ft: 14553 corp: 28/193b lim: 20 exec/s: 58 rss: 73Mb L: 7/19 MS: 1 ChangeByte- 00:07:31.743 #59 NEW cov: 12153 ft: 14561 corp: 29/203b lim: 20 exec/s: 59 rss: 73Mb L: 10/19 MS: 1 CrossOver- 00:07:31.743 #60 NEW cov: 12153 ft: 14570 corp: 30/210b lim: 20 exec/s: 60 rss: 73Mb L: 7/19 MS: 1 ChangeByte- 00:07:31.743 [2024-04-24 19:14:18.703984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:31.743 [2024-04-24 19:14:18.704013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.743 NEW_FUNC[1/2]: 0x10ffa00 in nvmf_ctrlr_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3355 00:07:31.743 NEW_FUNC[2/2]: 0x1100610 in spdk_nvmf_request_get_bdev /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:4775 00:07:31.743 #61 NEW cov: 12183 ft: 14632 corp: 31/227b lim: 20 exec/s: 61 rss: 73Mb L: 17/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:32.000 #62 NEW cov: 12183 ft: 14704 corp: 32/232b lim: 20 exec/s: 62 rss: 73Mb L: 5/19 MS: 1 CopyPart- 00:07:32.000 #63 NEW cov: 12183 ft: 14782 corp: 33/239b lim: 20 exec/s: 63 rss: 73Mb L: 7/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:32.000 #64 NEW cov: 12183 ft: 14804 corp: 34/244b lim: 20 exec/s: 64 rss: 73Mb L: 5/19 MS: 1 ChangeBit- 00:07:32.000 #65 NEW cov: 12183 ft: 14810 corp: 35/249b lim: 20 exec/s: 65 rss: 73Mb L: 5/19 MS: 1 EraseBytes- 00:07:32.000 [2024-04-24 19:14:18.914592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.001 [2024-04-24 19:14:18.914620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.001 #66 NEW cov: 12183 ft: 14859 corp: 36/266b lim: 20 exec/s: 66 rss: 74Mb L: 17/19 MS: 1 ChangeByte- 00:07:32.001 #67 NEW cov: 12183 ft: 14901 corp: 37/283b lim: 20 exec/s: 67 rss: 74Mb L: 17/19 MS: 1 InsertRepeatedBytes- 00:07:32.258 #68 NEW cov: 12183 ft: 14917 corp: 38/290b lim: 20 exec/s: 68 rss: 74Mb L: 7/19 MS: 1 ChangeBit- 00:07:32.258 #69 NEW cov: 12183 ft: 14921 corp: 39/295b lim: 20 exec/s: 69 rss: 74Mb L: 5/19 MS: 1 ShuffleBytes- 00:07:32.258 #70 NEW cov: 12183 ft: 14932 corp: 40/302b lim: 20 exec/s: 70 rss: 74Mb L: 7/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:32.258 #71 NEW cov: 12183 ft: 14941 corp: 41/308b lim: 20 exec/s: 71 rss: 74Mb L: 6/19 MS: 1 EraseBytes- 00:07:32.258 #72 NEW cov: 12183 ft: 14956 corp: 42/312b lim: 20 exec/s: 72 rss: 74Mb L: 4/19 MS: 1 PersAutoDict- DE: "\007\000"- 00:07:32.258 [2024-04-24 19:14:19.195219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.258 [2024-04-24 19:14:19.195253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.258 #73 NEW cov: 12183 ft: 15012 corp: 43/326b lim: 20 exec/s: 36 rss: 74Mb L: 14/19 MS: 1 CrossOver- 00:07:32.258 #73 DONE cov: 12183 ft: 15012 corp: 43/326b lim: 20 exec/s: 36 rss: 74Mb 00:07:32.258 ###### Recommended dictionary. ###### 00:07:32.258 "\007\000" # Uses: 9 00:07:32.258 ###### End of recommended dictionary. 
###### 00:07:32.258 Done 73 runs in 2 second(s) 00:07:32.516 19:14:19 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:32.516 19:14:19 -- ../common.sh@72 -- # (( i++ )) 00:07:32.516 19:14:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.516 19:14:19 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:32.516 19:14:19 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:32.516 19:14:19 -- nvmf/run.sh@24 -- # local timen=1 00:07:32.516 19:14:19 -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.516 19:14:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:32.516 19:14:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:32.516 19:14:19 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:32.516 19:14:19 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:32.516 19:14:19 -- nvmf/run.sh@34 -- # printf %02d 4 00:07:32.516 19:14:19 -- nvmf/run.sh@34 -- # port=4404 00:07:32.516 19:14:19 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:32.516 19:14:19 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:32.516 19:14:19 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.516 19:14:19 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:32.516 19:14:19 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:32.516 19:14:19 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:32.516 [2024-04-24 19:14:19.382115] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:32.516 [2024-04-24 19:14:19.382189] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621367 ] 00:07:32.516 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.774 [2024-04-24 19:14:19.579810] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.774 [2024-04-24 19:14:19.652716] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.774 [2024-04-24 19:14:19.712022] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.774 [2024-04-24 19:14:19.728232] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:32.774 INFO: Running with entropic power schedule (0xFF, 100). 
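Runs 2 and 3 above each finished by printing a "Recommended dictionary" block: the quoted byte strings are the DE: tokens that the CMP and PersAutoDict mutations kept reusing, with a use count per entry ("\007\000" was reused 9 times in the run that just completed). libFuzzer prints these entries with octal escapes, while its dictionary files are parsed with \xHH hex escapes, so carrying them into a future run takes a small conversion. A sketch for the four short entries, where /tmp/nvmf.dict is a hypothetical path and it is not shown in this log whether the llvm_nvme_fuzz wrapper forwards libFuzzer's -dict= option:

  # Sketch only: persist recommended entries in libFuzzer dictionary syntax.
  # Octal escapes from the log are rewritten as hex: \016 -> \x0e, \007 -> \x07.
  # A plain libFuzzer binary would consume this via -dict=/tmp/nvmf.dict;
  # -dict forwarding by the SPDK wrapper is an open question here.
  printf '%s\n' \
      '"\x00\x00\x00\x00"' \
      '"\x0e\x00\x00\x00"' \
      '"\x00\x04"' \
      '"\x07\x00"' > /tmp/nvmf.dict

The longer entries ("'\212\003\034\350o\011\000" and "\377\377\377\377\001)Es") convert the same way, one \xHH byte per octal escape or literal character.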
00:07:32.774 INFO: Seed: 530556177 00:07:32.774 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:32.774 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:32.774 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:32.774 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.774 #2 INITED exec/s: 0 rss: 64Mb 00:07:32.774 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:32.774 This may also happen if the target rejected all inputs we tried so far 00:07:32.774 [2024-04-24 19:14:19.783836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.774 [2024-04-24 19:14:19.783864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:32.774 [2024-04-24 19:14:19.783917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.774 [2024-04-24 19:14:19.783935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.774 [2024-04-24 19:14:19.783984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:32.774 [2024-04-24 19:14:19.783998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.287 NEW_FUNC[1/671]: 0x487e70 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:33.287 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.287 #12 NEW cov: 11674 ft: 11676 corp: 2/24b lim: 35 exec/s: 0 rss: 70Mb L: 23/23 MS: 5 ShuffleBytes-CrossOver-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:33.287 [2024-04-24 19:14:20.136920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff8aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.136975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.287 [2024-04-24 19:14:20.137101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.137122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.287 [2024-04-24 19:14:20.137216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.137236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.287 #16 NEW cov: 11806 ft: 12181 corp: 3/50b lim: 35 exec/s: 0 rss: 70Mb L: 26/26 MS: 4 ChangeBit-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:33.287 [2024-04-24 19:14:20.186783] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.186811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.287 [2024-04-24 19:14:20.186893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.186907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.287 [2024-04-24 19:14:20.186992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.187007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.287 #17 NEW cov: 11812 ft: 12494 corp: 4/73b lim: 35 exec/s: 0 rss: 70Mb L: 23/26 MS: 1 CrossOver- 00:07:33.287 [2024-04-24 19:14:20.246906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.287 [2024-04-24 19:14:20.246932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.287 [2024-04-24 19:14:20.247013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.288 [2024-04-24 19:14:20.247029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.288 [2024-04-24 19:14:20.247121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.288 [2024-04-24 19:14:20.247139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.288 #18 NEW cov: 11897 ft: 12792 corp: 5/99b lim: 35 exec/s: 0 rss: 70Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:33.288 [2024-04-24 19:14:20.297215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff8aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.288 [2024-04-24 19:14:20.297240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.288 [2024-04-24 19:14:20.297332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:6d1aff7a cdw11:4bea0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.288 [2024-04-24 19:14:20.297351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.288 [2024-04-24 19:14:20.297433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff0900 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.288 [2024-04-24 19:14:20.297449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.544 #19 NEW cov: 11897 ft: 
12850 corp: 6/125b lim: 35 exec/s: 0 rss: 70Mb L: 26/26 MS: 1 CMP- DE: "zm\032K\352o\011\000"- 00:07:33.544 [2024-04-24 19:14:20.357597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.357625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.357710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.357727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.357817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.357833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.357919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.357936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.544 #20 NEW cov: 11897 ft: 13243 corp: 7/157b lim: 35 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 CopyPart- 00:07:33.544 [2024-04-24 19:14:20.417462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff8aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.417489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.417575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff01ffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.417592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.417681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.417697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.544 #21 NEW cov: 11897 ft: 13303 corp: 8/184b lim: 35 exec/s: 0 rss: 71Mb L: 27/32 MS: 1 InsertByte- 00:07:33.544 [2024-04-24 19:14:20.468387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.468414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.468506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.468522] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.468610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.468625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.468714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.468729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.468815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.468831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.544 #22 NEW cov: 11897 ft: 13470 corp: 9/219b lim: 35 exec/s: 0 rss: 71Mb L: 35/35 MS: 1 CrossOver- 00:07:33.544 [2024-04-24 19:14:20.527996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.528025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.528115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.528132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.544 [2024-04-24 19:14:20.528221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.544 [2024-04-24 19:14:20.528236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.544 #29 NEW cov: 11897 ft: 13482 corp: 10/241b lim: 35 exec/s: 0 rss: 71Mb L: 22/35 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:33.800 [2024-04-24 19:14:20.578048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.578086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.578178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:fef80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.578195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.578285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.578301] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.800 #30 NEW cov: 11897 ft: 13550 corp: 11/267b lim: 35 exec/s: 0 rss: 71Mb L: 26/35 MS: 1 ChangeBinInt- 00:07:33.800 [2024-04-24 19:14:20.639115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.639144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.639226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.639242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.639335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.639350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.639434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.639449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.639541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.639556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.800 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.800 #31 NEW cov: 11920 ft: 13617 corp: 12/302b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:33.800 [2024-04-24 19:14:20.699299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.699324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.699418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.699433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.699518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.699533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.699622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ 
(05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.699637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.699724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.699740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:33.800 #32 NEW cov: 11920 ft: 13639 corp: 13/337b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:33.800 [2024-04-24 19:14:20.748998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.749023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.749118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.749134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.749224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:4bea6d1a cdw11:6f090000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.749239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.800 #33 NEW cov: 11920 ft: 13653 corp: 14/360b lim: 35 exec/s: 33 rss: 72Mb L: 23/35 MS: 1 PersAutoDict- DE: "zm\032K\352o\011\000"- 00:07:33.800 [2024-04-24 19:14:20.798971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.798996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.800 [2024-04-24 19:14:20.799079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.800 [2024-04-24 19:14:20.799108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.801 [2024-04-24 19:14:20.799191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.801 [2024-04-24 19:14:20.799205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 #34 NEW cov: 11920 ft: 13679 corp: 15/382b lim: 35 exec/s: 34 rss: 72Mb L: 22/35 MS: 1 ShuffleBytes- 00:07:34.147 [2024-04-24 19:14:20.859884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.859908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.859990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.860006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.860095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.860111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.860197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.860212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.860295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.860310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.147 #35 NEW cov: 11920 ft: 13727 corp: 16/417b lim: 35 exec/s: 35 rss: 72Mb L: 35/35 MS: 1 CrossOver- 00:07:34.147 [2024-04-24 19:14:20.919386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff8aff cdw11:ffa70003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.919410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.919502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:7a6dffff cdw11:1a4b0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.919517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.919604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ff6f09 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.919621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 #36 NEW cov: 11920 ft: 13758 corp: 17/444b lim: 35 exec/s: 36 rss: 72Mb L: 27/35 MS: 1 InsertByte- 00:07:34.147 [2024-04-24 19:14:20.979919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.979944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.980035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.980050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.980147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.980161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:20.980243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:20.980258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.147 #37 NEW cov: 11920 ft: 13765 corp: 18/476b lim: 35 exec/s: 37 rss: 72Mb L: 32/35 MS: 1 ChangeBit- 00:07:34.147 [2024-04-24 19:14:21.030412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.030436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.030517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.030532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.030618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.030633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.030722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.030738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.030822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.030837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.147 #38 NEW cov: 11920 ft: 13786 corp: 19/511b lim: 35 exec/s: 38 rss: 72Mb L: 35/35 MS: 1 ChangeByte- 00:07:34.147 [2024-04-24 19:14:21.080474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.080499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.080592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.080609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.080687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:24000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.080701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.080783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.080798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.147 #39 NEW cov: 11920 ft: 13798 corp: 20/544b lim: 35 exec/s: 39 rss: 72Mb L: 33/35 MS: 1 InsertByte- 00:07:34.147 [2024-04-24 19:14:21.140210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.140234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.140317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:fef80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.140333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.147 [2024-04-24 19:14:21.140411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.147 [2024-04-24 19:14:21.140427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 #40 NEW cov: 11920 ft: 13810 corp: 21/570b lim: 35 exec/s: 40 rss: 72Mb L: 26/35 MS: 1 ChangeByte- 00:07:34.434 [2024-04-24 19:14:21.200555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.200580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.200672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.200688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.200778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00fe0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.200793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 #41 NEW cov: 11920 ft: 13819 corp: 22/596b lim: 35 exec/s: 41 rss: 72Mb L: 26/35 MS: 1 ChangeBinInt- 00:07:34.434 [2024-04-24 19:14:21.250639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:08171717 cdw11:3fff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 
[2024-04-24 19:14:21.250664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.250753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.250768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.250856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.250871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 #45 NEW cov: 11920 ft: 13861 corp: 23/620b lim: 35 exec/s: 45 rss: 72Mb L: 24/35 MS: 4 CrossOver-ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:34.434 [2024-04-24 19:14:21.310730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:17171717 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.310754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.310840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00001717 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.310856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.310939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:16170000 cdw11:17170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.310953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 #46 NEW cov: 11920 ft: 13883 corp: 24/642b lim: 35 exec/s: 46 rss: 72Mb L: 22/35 MS: 1 ChangeBinInt- 00:07:34.434 [2024-04-24 19:14:21.361003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:f6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.361027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.361119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:01f80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.361135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.361226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.361242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 #47 NEW cov: 11920 ft: 13901 corp: 25/668b lim: 35 exec/s: 47 rss: 72Mb L: 26/35 MS: 1 ChangeBinInt- 00:07:34.434 [2024-04-24 19:14:21.411492] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.411517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.411610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:fef80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.411626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.411713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.411728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.434 [2024-04-24 19:14:21.411823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00005e5e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.434 [2024-04-24 19:14:21.411839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.434 #48 NEW cov: 11920 ft: 13913 corp: 26/699b lim: 35 exec/s: 48 rss: 72Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:34.691 [2024-04-24 19:14:21.472016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff8aff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.472041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.472125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.472142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.472227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffa5ff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.472242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.472328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.472343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.472434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.472449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.691 #49 NEW cov: 11920 ft: 13928 corp: 27/734b lim: 35 exec/s: 49 rss: 72Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:34.691 [2024-04-24 
19:14:21.521666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.521691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.521779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00ff0000 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.691 [2024-04-24 19:14:21.521795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.691 [2024-04-24 19:14:21.521889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.521903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.692 #50 NEW cov: 11920 ft: 13952 corp: 28/761b lim: 35 exec/s: 50 rss: 72Mb L: 27/35 MS: 1 InsertByte- 00:07:34.692 [2024-04-24 19:14:21.581805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.581831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.581929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.581949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.582030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff170000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.582045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.692 #51 NEW cov: 11920 ft: 13976 corp: 29/784b lim: 35 exec/s: 51 rss: 73Mb L: 23/35 MS: 1 CMP- DE: "\377\027"- 00:07:34.692 [2024-04-24 19:14:21.632649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.632677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.632782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.632797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.632881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.632900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.692 
[2024-04-24 19:14:21.632985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.633001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.633086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.633103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:34.692 #52 NEW cov: 11920 ft: 13985 corp: 30/819b lim: 35 exec/s: 52 rss: 73Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:34.692 [2024-04-24 19:14:21.692185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00fe0a0a cdw11:f8000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.692213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.692302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:005e0000 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.692319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.692 [2024-04-24 19:14:21.692400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00005e00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.692 [2024-04-24 19:14:21.692415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.948 #53 NEW cov: 11920 ft: 13993 corp: 31/842b lim: 35 exec/s: 53 rss: 73Mb L: 23/35 MS: 1 EraseBytes- 00:07:34.948 [2024-04-24 19:14:21.752397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.948 [2024-04-24 19:14:21.752424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.948 [2024-04-24 19:14:21.752512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.948 [2024-04-24 19:14:21.752530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.948 [2024-04-24 19:14:21.752620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.948 [2024-04-24 19:14:21.752636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.948 #54 NEW cov: 11920 ft: 14026 corp: 32/869b lim: 35 exec/s: 27 rss: 73Mb L: 27/35 MS: 1 EraseBytes- 00:07:34.948 #54 DONE cov: 11920 ft: 14026 corp: 32/869b lim: 35 exec/s: 27 rss: 73Mb 00:07:34.948 ###### Recommended dictionary. ###### 00:07:34.948 "zm\032K\352o\011\000" # Uses: 1 00:07:34.948 "\377\027" # Uses: 0 00:07:34.948 ###### End of recommended dictionary. 
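[Editor's note] The "Recommended dictionary" block above is standard libFuzzer output: byte sequences the fuzzer learned (surfacing in the MS: annotations as CMP and PersAutoDict mutations) that proved useful against this target. Such tokens can be carried into future runs as a dictionary file. A purely hypothetical example follows, transcribing the octal escapes printed above (\032 \352 \011 \000 \377 \027) into the \xNN form libFuzzer dictionary files use; the file path is invented, and whether the SPDK harness forwards a -dict= option to libFuzzer is not shown in this log:

    # Hypothetical: persist the recommended tokens as a libFuzzer dictionary.
    cat > /tmp/llvm_nvmf_4.dict <<'EOF'
    # from "zm\032K\352o\011\000" (# Uses: 1) -- octal transcribed to hex
    kw1="zm\x1aK\xea\x6f\x09\x00"
    # from "\377\027" (# Uses: 0)
    kw2="\xff\x17"
    EOF

The first token is the 8-byte sequence that the two PersAutoDict mutations in this run reused (DE: "zm\032K\352o\011\000").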
###### 00:07:34.948 Done 54 runs in 2 second(s) 00:07:34.948 19:14:21 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:34.948 19:14:21 -- ../common.sh@72 -- # (( i++ )) 00:07:34.948 19:14:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.948 19:14:21 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:34.948 19:14:21 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:34.948 19:14:21 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.948 19:14:21 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.948 19:14:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:34.948 19:14:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:34.948 19:14:21 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:34.948 19:14:21 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:34.948 19:14:21 -- nvmf/run.sh@34 -- # printf %02d 5 00:07:34.948 19:14:21 -- nvmf/run.sh@34 -- # port=4405 00:07:34.948 19:14:21 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:34.948 19:14:21 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:34.948 19:14:21 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.948 19:14:21 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:34.948 19:14:21 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:34.948 19:14:21 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:34.949 [2024-04-24 19:14:21.926343] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:34.949 [2024-04-24 19:14:21.926417] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621707 ] 00:07:34.949 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.205 [2024-04-24 19:14:22.124383] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.205 [2024-04-24 19:14:22.196465] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.463 [2024-04-24 19:14:22.255777] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.463 [2024-04-24 19:14:22.271979] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:35.463 INFO: Running with entropic power schedule (0xFF, 100). 
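[Editor's note] The ../common.sh trace lines that bracket each run show the driver loop: a counter walks the fuzzer indices, start_llvm_fuzz launches one instance per index, and the rm -rf traced at nvmf/run.sh@54 removes that run's config and suppression file when the instance finishes. A rough reconstruction under those assumptions (fuzz_num and the exact function boundaries are not visible in the trace):

    # Rough sketch of the loop implied by the ../common.sh@72-73 trace lines.
    # fuzz_num is assumed to be set elsewhere; only the test itself is traced.
    i=0
    while (( i < fuzz_num )); do
        start_llvm_fuzz "$i" 1 0x1   # index, time budget, core mask
        # start_llvm_fuzz ends by removing its per-run artifacts, as traced:
        #   rm -rf /tmp/fuzz_json_$i.conf /var/tmp/suppress_nvmf_fuzz
        (( i++ ))
    done

Consistent with that, fuzzer 4 above was invoked as start_llvm_fuzz 4 1 0x1 and fuzzer 5 here as start_llvm_fuzz 5 1 0x1, each on core mask 0x1 with the same one-unit time budget (-t 1).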
00:07:35.463 INFO: Seed: 3074552580 00:07:35.463 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:35.463 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:35.463 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:35.463 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.463 #2 INITED exec/s: 0 rss: 64Mb 00:07:35.463 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.463 This may also happen if the target rejected all inputs we tried so far 00:07:35.463 [2024-04-24 19:14:22.327307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.463 [2024-04-24 19:14:22.327335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.721 NEW_FUNC[1/671]: 0x48a000 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:35.721 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.721 #4 NEW cov: 11687 ft: 11674 corp: 2/11b lim: 45 exec/s: 0 rss: 71Mb L: 10/10 MS: 2 CrossOver-CMP- DE: "\000\011o\353a\361~~"- 00:07:35.721 [2024-04-24 19:14:22.658363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:46460a0a cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.721 [2024-04-24 19:14:22.658406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.721 [2024-04-24 19:14:22.658465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.721 [2024-04-24 19:14:22.658481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.721 #7 NEW cov: 11817 ft: 12878 corp: 3/29b lim: 45 exec/s: 0 rss: 71Mb L: 18/18 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:07:35.721 [2024-04-24 19:14:22.708455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.721 [2024-04-24 19:14:22.708482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.721 [2024-04-24 19:14:22.708539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:02290002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.721 [2024-04-24 19:14:22.708554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.721 #8 NEW cov: 11823 ft: 13079 corp: 4/47b lim: 45 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 CMP- DE: "\001\000\000\000\002)Es"- 00:07:35.979 [2024-04-24 19:14:22.748726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.748753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.748810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.748824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.748882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff020001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.748895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.979 #9 NEW cov: 11908 ft: 13541 corp: 5/75b lim: 45 exec/s: 0 rss: 72Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:35.979 [2024-04-24 19:14:22.799002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.799027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.799104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.799134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.799188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.799202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.799254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff020001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.799267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.979 #10 NEW cov: 11908 ft: 14063 corp: 6/112b lim: 45 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:35.979 [2024-04-24 19:14:22.848624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.848650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.979 #11 NEW cov: 11908 ft: 14188 corp: 7/122b lim: 45 exec/s: 0 rss: 72Mb L: 10/37 MS: 1 ShuffleBytes- 00:07:35.979 [2024-04-24 19:14:22.889265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.889292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.889364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.889378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.889435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.889449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.979 [2024-04-24 19:14:22.889504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff020001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.979 [2024-04-24 19:14:22.889518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.979 #12 NEW cov: 11908 ft: 14271 corp: 8/159b lim: 45 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 CrossOver- 00:07:35.980 [2024-04-24 19:14:22.939261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.980 [2024-04-24 19:14:22.939286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.980 [2024-04-24 19:14:22.939343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.980 [2024-04-24 19:14:22.939357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.980 [2024-04-24 19:14:22.939429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fffff7ff cdw11:ff020001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.980 [2024-04-24 19:14:22.939444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.980 #13 NEW cov: 11908 ft: 14335 corp: 9/187b lim: 45 exec/s: 0 rss: 72Mb L: 28/37 MS: 1 ChangeBit- 00:07:35.980 [2024-04-24 19:14:22.979155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.980 [2024-04-24 19:14:22.979183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.980 [2024-04-24 19:14:22.979238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.980 [2024-04-24 19:14:22.979253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.238 #14 NEW cov: 11908 ft: 14434 corp: 10/210b lim: 45 exec/s: 0 rss: 72Mb L: 23/37 MS: 1 CrossOver- 00:07:36.238 [2024-04-24 19:14:23.019757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.238 [2024-04-24 19:14:23.019782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.019837] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.019851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.019905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.019918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.019972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff25ffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.019985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.020038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.020052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.239 #15 NEW cov: 11908 ft: 14508 corp: 11/255b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:07:36.239 [2024-04-24 19:14:23.069439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a1900 cdw11:00090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.069464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.069521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0100f17e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.069534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.239 #16 NEW cov: 11908 ft: 14598 corp: 12/275b lim: 45 exec/s: 0 rss: 72Mb L: 20/45 MS: 1 CMP- DE: "\031\000"- 00:07:36.239 [2024-04-24 19:14:23.110054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.110083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.110139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.110153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.110209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fff1ffff cdw11:7e010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.110225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 
19:14:23.110278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.110292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.110347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.110360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.239 #17 NEW cov: 11908 ft: 14623 corp: 13/320b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 CopyPart- 00:07:36.239 [2024-04-24 19:14:23.160137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.160163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.160248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.160262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.160317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fff1ffff cdw11:7e010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.160330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.160384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff3b0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.160397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.160453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.160467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.239 #18 NEW cov: 11908 ft: 14646 corp: 14/365b lim: 45 exec/s: 0 rss: 72Mb L: 45/45 MS: 1 ChangeByte- 00:07:36.239 [2024-04-24 19:14:23.210282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:04090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.210306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.210363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.210377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 
19:14:23.210447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fff1ffff cdw11:7e010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.210462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.210515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff3b0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.210532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.239 [2024-04-24 19:14:23.210585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.239 [2024-04-24 19:14:23.210599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.239 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.239 #19 NEW cov: 11931 ft: 14693 corp: 15/410b lim: 45 exec/s: 0 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:07:36.498 [2024-04-24 19:14:23.260295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:096f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.498 [2024-04-24 19:14:23.260320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.498 [2024-04-24 19:14:23.260376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00007e01 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.498 [2024-04-24 19:14:23.260390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.498 [2024-04-24 19:14:23.260448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.498 [2024-04-24 19:14:23.260462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.498 [2024-04-24 19:14:23.260517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.260531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.499 #20 NEW cov: 11931 ft: 14712 corp: 16/453b lim: 45 exec/s: 0 rss: 73Mb L: 43/45 MS: 1 CrossOver- 00:07:36.499 [2024-04-24 19:14:23.300468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.300493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.300566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.300580] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.300635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.300649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.300705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0000ff00 cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.300719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.499 #21 NEW cov: 11931 ft: 14743 corp: 17/495b lim: 45 exec/s: 21 rss: 73Mb L: 42/45 MS: 1 InsertRepeatedBytes- 00:07:36.499 [2024-04-24 19:14:23.340399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a1900 cdw11:00090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.340424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.340481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0100f17e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.340498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.340552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:45730229 cdw11:7e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.340565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.499 #22 NEW cov: 11931 ft: 14838 corp: 18/529b lim: 45 exec/s: 22 rss: 73Mb L: 34/45 MS: 1 CrossOver- 00:07:36.499 [2024-04-24 19:14:23.390855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.390880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.390952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.390967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.391024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fff1ffff cdw11:7e010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.391037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.391095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.391109] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.391173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:0a290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.391187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.499 #23 NEW cov: 11931 ft: 14849 corp: 19/574b lim: 45 exec/s: 23 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:36.499 [2024-04-24 19:14:23.430436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a1900 cdw11:00090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.430460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.430533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:01fff17e cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.430547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.499 #24 NEW cov: 11931 ft: 14910 corp: 20/594b lim: 45 exec/s: 24 rss: 73Mb L: 20/45 MS: 1 CMP- DE: "\377\377"- 00:07:36.499 [2024-04-24 19:14:23.470405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00290a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.470430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 #25 NEW cov: 11931 ft: 14956 corp: 21/604b lim: 45 exec/s: 25 rss: 73Mb L: 10/45 MS: 1 ChangeBit- 00:07:36.499 [2024-04-24 19:14:23.510667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:46460a0a cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.510692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.499 [2024-04-24 19:14:23.510753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ff4646ff cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.499 [2024-04-24 19:14:23.510768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.759 #26 NEW cov: 11931 ft: 14965 corp: 22/624b lim: 45 exec/s: 26 rss: 73Mb L: 20/45 MS: 1 PersAutoDict- DE: "\377\377"- 00:07:36.759 [2024-04-24 19:14:23.561270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.561295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.561365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.561379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:36.759 [2024-04-24 19:14:23.561435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.561449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.561504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff25ffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.561517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.561572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.561586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.759 #27 NEW cov: 11931 ft: 14984 corp: 23/669b lim: 45 exec/s: 27 rss: 73Mb L: 45/45 MS: 1 ChangeBit- 00:07:36.759 [2024-04-24 19:14:23.600955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a1900 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.600980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.601036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:eb61096f cdw11:f17e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.601051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.641049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a0a1900 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.641079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.759 [2024-04-24 19:14:23.641152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:eb3c096f cdw11:61f10003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.759 [2024-04-24 19:14:23.641166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.759 #29 NEW cov: 11931 ft: 15027 corp: 24/694b lim: 45 exec/s: 29 rss: 73Mb L: 25/45 MS: 2 InsertRepeatedBytes-InsertByte- 00:07:36.760 [2024-04-24 19:14:23.681116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:46460a0a cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.681140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.760 [2024-04-24 19:14:23.681215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.681230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:36.760 #30 NEW cov: 11931 ft: 15041 corp: 25/713b lim: 45 exec/s: 30 rss: 73Mb L: 19/45 MS: 1 InsertByte- 00:07:36.760 [2024-04-24 19:14:23.721082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.721107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.760 #31 NEW cov: 11931 ft: 15134 corp: 26/729b lim: 45 exec/s: 31 rss: 73Mb L: 16/45 MS: 1 EraseBytes- 00:07:36.760 [2024-04-24 19:14:23.761679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:096f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.761705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.760 [2024-04-24 19:14:23.761762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00007e01 cdw11:00ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.761777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.760 [2024-04-24 19:14:23.761833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.761848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.760 [2024-04-24 19:14:23.761904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:fffffffd cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.760 [2024-04-24 19:14:23.761918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.018 #32 NEW cov: 11931 ft: 15143 corp: 27/772b lim: 45 exec/s: 32 rss: 73Mb L: 43/45 MS: 1 ChangeBit- 00:07:37.018 [2024-04-24 19:14:23.811971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.811996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.018 [2024-04-24 19:14:23.812069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.812084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.018 [2024-04-24 19:14:23.812137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.812151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.018 [2024-04-24 19:14:23.812216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff250100 cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 
19:14:23.812230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.018 [2024-04-24 19:14:23.812284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.812298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.018 #33 NEW cov: 11931 ft: 15159 corp: 28/817b lim: 45 exec/s: 33 rss: 73Mb L: 45/45 MS: 1 CMP- DE: "\000\000\001\000"- 00:07:37.018 [2024-04-24 19:14:23.851580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.851605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.018 [2024-04-24 19:14:23.851677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ec32096f cdw11:a0740000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.018 [2024-04-24 19:14:23.851691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.018 #34 NEW cov: 11931 ft: 15214 corp: 29/835b lim: 45 exec/s: 34 rss: 73Mb L: 18/45 MS: 1 CMP- DE: "\000\011o\3542\240t\024"- 00:07:37.018 [2024-04-24 19:14:23.892215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.892238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.892294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:fdff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.892308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.892361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.892375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.892431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff25ffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.892444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.892498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.892511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.019 #35 NEW cov: 11931 ft: 15226 corp: 30/880b lim: 45 exec/s: 35 rss: 73Mb L: 45/45 MS: 1 ShuffleBytes- 00:07:37.019 [2024-04-24 19:14:23.941854] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.941878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.941952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.941967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.019 #36 NEW cov: 11931 ft: 15241 corp: 31/904b lim: 45 exec/s: 36 rss: 73Mb L: 24/45 MS: 1 CrossOver- 00:07:37.019 [2024-04-24 19:14:23.992389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.992413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.992486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:fdff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.992504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.992561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:04ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.992575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.992630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff25ffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.992644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.019 [2024-04-24 19:14:23.992698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.019 [2024-04-24 19:14:23.992712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.019 #37 NEW cov: 11931 ft: 15256 corp: 32/949b lim: 45 exec/s: 37 rss: 74Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:37.277 [2024-04-24 19:14:24.042604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.042630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.042689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0a000100 cdw11:096f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.042703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.042762] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00007e01 cdw11:00fd0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.042776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.042832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.042845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.042901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.042915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.277 #38 NEW cov: 11931 ft: 15267 corp: 33/994b lim: 45 exec/s: 38 rss: 74Mb L: 45/45 MS: 1 CopyPart- 00:07:37.277 [2024-04-24 19:14:24.082673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:04090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.082698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.082769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.082784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.082840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fff1fbff cdw11:7e010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.082853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.082911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff3b0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.082925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.082979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.082993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.277 #39 NEW cov: 11931 ft: 15273 corp: 34/1039b lim: 45 exec/s: 39 rss: 74Mb L: 45/45 MS: 1 ChangeBinInt- 00:07:37.277 [2024-04-24 19:14:24.132503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.132529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.132586] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.132600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.132659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:f7ffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.132672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.277 #40 NEW cov: 11931 ft: 15285 corp: 35/1069b lim: 45 exec/s: 40 rss: 74Mb L: 30/45 MS: 1 CrossOver- 00:07:37.277 [2024-04-24 19:14:24.172442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.172467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.172526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.172539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.277 #41 NEW cov: 11931 ft: 15307 corp: 36/1095b lim: 45 exec/s: 41 rss: 74Mb L: 26/45 MS: 1 EraseBytes- 00:07:37.277 [2024-04-24 19:14:24.213028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.213053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.213144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.213158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.213211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.213225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.277 [2024-04-24 19:14:24.213281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff250100 cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.277 [2024-04-24 19:14:24.213294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.278 [2024-04-24 19:14:24.213354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:ffff2525 cdw11:02290000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.278 [2024-04-24 19:14:24.213367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.278 #42 NEW cov: 11931 ft: 15346 corp: 
37/1140b lim: 45 exec/s: 42 rss: 74Mb L: 45/45 MS: 1 CopyPart- 00:07:37.278 [2024-04-24 19:14:24.263036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:096f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.278 [2024-04-24 19:14:24.263064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.278 [2024-04-24 19:14:24.263148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00007e01 cdw11:ff000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.278 [2024-04-24 19:14:24.263163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.278 [2024-04-24 19:14:24.263236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.278 [2024-04-24 19:14:24.263251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.278 [2024-04-24 19:14:24.263307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:25250001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.278 [2024-04-24 19:14:24.263320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.278 #43 NEW cov: 11931 ft: 15364 corp: 38/1183b lim: 45 exec/s: 43 rss: 74Mb L: 43/45 MS: 1 ShuffleBytes- 00:07:37.536 [2024-04-24 19:14:24.302855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00090a0a cdw11:6feb0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.536 [2024-04-24 19:14:24.302879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.536 [2024-04-24 19:14:24.302951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.536 [2024-04-24 19:14:24.302966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.536 #44 NEW cov: 11931 ft: 15371 corp: 39/1209b lim: 45 exec/s: 22 rss: 74Mb L: 26/45 MS: 1 ChangeBinInt- 00:07:37.536 #44 DONE cov: 11931 ft: 15371 corp: 39/1209b lim: 45 exec/s: 22 rss: 74Mb 00:07:37.536 ###### Recommended dictionary. ###### 00:07:37.536 "\000\011o\353a\361~~" # Uses: 0 00:07:37.536 "\001\000\000\000\002)Es" # Uses: 0 00:07:37.536 "\031\000" # Uses: 0 00:07:37.536 "\377\377" # Uses: 1 00:07:37.536 "\000\000\001\000" # Uses: 0 00:07:37.536 "\000\011o\3542\240t\024" # Uses: 0 00:07:37.536 ###### End of recommended dictionary. 
###### 00:07:37.536 Done 44 runs in 2 second(s) 00:07:37.536 19:14:24 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:37.536 19:14:24 -- ../common.sh@72 -- # (( i++ )) 00:07:37.536 19:14:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.536 19:14:24 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:37.536 19:14:24 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:37.536 19:14:24 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.536 19:14:24 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.536 19:14:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:37.536 19:14:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:37.536 19:14:24 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:37.536 19:14:24 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:37.536 19:14:24 -- nvmf/run.sh@34 -- # printf %02d 6 00:07:37.536 19:14:24 -- nvmf/run.sh@34 -- # port=4406 00:07:37.536 19:14:24 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:37.536 19:14:24 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:37.536 19:14:24 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.536 19:14:24 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:37.536 19:14:24 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:37.536 19:14:24 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:37.536 [2024-04-24 19:14:24.500709] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:37.536 [2024-04-24 19:14:24.500785] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621996 ] 00:07:37.536 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.794 [2024-04-24 19:14:24.707901] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.794 [2024-04-24 19:14:24.781341] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.053 [2024-04-24 19:14:24.840857] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.053 [2024-04-24 19:14:24.857029] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:38.053 INFO: Running with entropic power schedule (0xFF, 100). 
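A note on reading the *NOTICE* records in these runs: each pair is a fuzzer-mutated admin command (CREATE IO SQ, opcode 01h, in run 5 above; DELETE IO CQ, opcode 04h, in run 6 below) followed by the target's completion. NVMe over Fabrics sets up queues through the Fabrics Connect command and does not implement the Create/Delete I/O queue admin opcodes, so the target completes every mutated command here with INVALID OPCODE, which is the status printed after each command regardless of what values the fuzzer put in the dwords. The cdw10/cdw11 hex values carry the fuzzed fields; the sketch below decodes them for one command from run 5. The field layout is taken from the NVMe base specification and decode_create_io_sq is a hypothetical helper for illustration, not code from the SPDK tree.

    /*
     * Reader's sketch: decode the CDW10/CDW11 dwords printed for the
     * mutated CREATE IO SQ (opcode 01h) commands above. Field layout
     * per the NVMe base specification; illustrative only.
     */
    #include <stdint.h>
    #include <stdio.h>

    struct create_io_sq_fields {
        uint16_t qid;   /* CDW10 bits 15:0  - submission queue id */
        uint16_t qsize; /* CDW10 bits 31:16 - queue size, 0-based */
        uint8_t  pc;    /* CDW11 bit 0      - physically contiguous */
        uint8_t  qprio; /* CDW11 bits 2:1   - queue priority */
        uint16_t cqid;  /* CDW11 bits 31:16 - paired completion queue id */
    };

    static struct create_io_sq_fields
    decode_create_io_sq(uint32_t cdw10, uint32_t cdw11)
    {
        struct create_io_sq_fields f = {
            .qid   = (uint16_t)(cdw10 & 0xffff),
            .qsize = (uint16_t)(cdw10 >> 16),
            .pc    = (uint8_t)(cdw11 & 0x1),
            .qprio = (uint8_t)((cdw11 >> 1) & 0x3),
            .cqid  = (uint16_t)(cdw11 >> 16),
        };
        return f;
    }

    int main(void)
    {
        /* cdw10:00090a0a cdw11:6feb0003 -- one mutated command from run 5 */
        struct create_io_sq_fields f =
            decode_create_io_sq(0x00090a0a, 0x6feb0003);
        /* prints: qid=0x0a0a qsize=0x0009 pc=1 qprio=1 cqid=0x6feb */
        printf("qid=0x%04x qsize=0x%04x pc=%u qprio=%u cqid=0x%04x\n",
               f.qid, f.qsize, f.pc, f.qprio, f.cqid);
        return 0;
    }

Even a well-formed decode like this would be rejected on a fabrics target, since the opcode itself is unsupported there; for the DELETE IO CQ commands in run 6, only CDW10 bits 15:0 (the queue id to delete) are meaningful. The "###### Recommended dictionary ######" block printed at the end of each run is standard libFuzzer output: the quoted, escaped byte sequences that produced new coverage, each with a use count, in a form intended to be copied into a dictionary file and fed back to libFuzzer with its -dict= flag on a later run.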
00:07:38.053 INFO: Seed: 1362595282 00:07:38.053 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:38.053 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:38.053 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:38.053 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.053 #2 INITED exec/s: 0 rss: 64Mb 00:07:38.053 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:38.053 This may also happen if the target rejected all inputs we tried so far 00:07:38.053 [2024-04-24 19:14:24.914309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:38.053 [2024-04-24 19:14:24.914338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.311 NEW_FUNC[1/669]: 0x48c810 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:38.311 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.311 #4 NEW cov: 11604 ft: 11604 corp: 2/3b lim: 10 exec/s: 0 rss: 70Mb L: 2/2 MS: 2 ChangeByte-InsertByte- 00:07:38.311 [2024-04-24 19:14:25.245437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000265d cdw11:00000000 00:07:38.311 [2024-04-24 19:14:25.245511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.311 #5 NEW cov: 11734 ft: 12251 corp: 3/6b lim: 10 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 CopyPart- 00:07:38.311 [2024-04-24 19:14:25.295308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000890a cdw11:00000000 00:07:38.311 [2024-04-24 19:14:25.295333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.311 #7 NEW cov: 11740 ft: 12562 corp: 4/8b lim: 10 exec/s: 0 rss: 71Mb L: 2/3 MS: 2 ShuffleBytes-InsertByte- 00:07:38.569 [2024-04-24 19:14:25.335382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.335407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 #8 NEW cov: 11825 ft: 12855 corp: 5/10b lim: 10 exec/s: 0 rss: 71Mb L: 2/3 MS: 1 EraseBytes- 00:07:38.569 [2024-04-24 19:14:25.375508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000265d cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.375535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 #9 NEW cov: 11825 ft: 13080 corp: 6/13b lim: 10 exec/s: 0 rss: 71Mb L: 3/3 MS: 1 ChangeBit- 00:07:38.569 [2024-04-24 19:14:25.415600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000da99 cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.415624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 #10 NEW cov: 
11825 ft: 13138 corp: 7/16b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 ChangeBinInt- 00:07:38.569 [2024-04-24 19:14:25.455857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.455882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 [2024-04-24 19:14:25.455938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.455952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.569 #11 NEW cov: 11825 ft: 13360 corp: 8/20b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:07:38.569 [2024-04-24 19:14:25.505885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.505909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 #14 NEW cov: 11825 ft: 13427 corp: 9/22b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 3 ShuffleBytes-ChangeBit-CopyPart- 00:07:38.569 [2024-04-24 19:14:25.546027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003089 cdw11:00000000 00:07:38.569 [2024-04-24 19:14:25.546051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.569 #15 NEW cov: 11825 ft: 13450 corp: 10/25b lim: 10 exec/s: 0 rss: 72Mb L: 3/4 MS: 1 InsertByte- 00:07:38.827 [2024-04-24 19:14:25.586137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004aca cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.586162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 #16 NEW cov: 11825 ft: 13488 corp: 11/27b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:07:38.827 [2024-04-24 19:14:25.626281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d66 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.626305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 #17 NEW cov: 11825 ft: 13523 corp: 12/29b lim: 10 exec/s: 0 rss: 72Mb L: 2/4 MS: 1 ChangeBit- 00:07:38.827 [2024-04-24 19:14:25.666457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daf4 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.666482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.666553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009966 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.666567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.827 #18 NEW cov: 11825 ft: 13537 corp: 13/33b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 InsertByte- 00:07:38.827 [2024-04-24 19:14:25.706851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 
cdw10:0000265d cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.706879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.706951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006655 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.706966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.707020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005555 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.707034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.707088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005555 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.707102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.827 #19 NEW cov: 11825 ft: 13829 corp: 14/42b lim: 10 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:38.827 [2024-04-24 19:14:25.746854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000077ec cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.746880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 #20 NEW cov: 11825 ft: 13859 corp: 15/44b lim: 10 exec/s: 0 rss: 72Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:38.827 [2024-04-24 19:14:25.786749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002625 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.786774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.827 #21 NEW cov: 11848 ft: 13891 corp: 16/47b lim: 10 exec/s: 0 rss: 72Mb L: 3/9 MS: 1 ChangeByte- 00:07:38.827 [2024-04-24 19:14:25.827315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000026ff cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.827340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.827410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.827424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.827478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.827491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.827546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.827559] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.827 [2024-04-24 19:14:25.827615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00002526 cdw11:00000000 00:07:38.827 [2024-04-24 19:14:25.827629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.085 #22 NEW cov: 11848 ft: 13953 corp: 17/57b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:39.085 [2024-04-24 19:14:25.876986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.877012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.085 #23 NEW cov: 11848 ft: 13965 corp: 18/59b lim: 10 exec/s: 0 rss: 72Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:39.085 [2024-04-24 19:14:25.917238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000899b cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.917264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.085 [2024-04-24 19:14:25.917333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009b9b cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.917347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.085 #24 NEW cov: 11848 ft: 14041 corp: 19/64b lim: 10 exec/s: 24 rss: 72Mb L: 5/10 MS: 1 InsertRepeatedBytes- 00:07:39.085 [2024-04-24 19:14:25.957589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000899b cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.957614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.085 [2024-04-24 19:14:25.957685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009bff cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.957700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.085 [2024-04-24 19:14:25.957755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.957768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.085 [2024-04-24 19:14:25.957820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ff9b cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.957834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.085 #25 NEW cov: 11848 ft: 14044 corp: 20/73b lim: 10 exec/s: 25 rss: 72Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:39.085 [2024-04-24 19:14:25.997455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.997480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:39.085 [2024-04-24 19:14:25.997550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000da99 cdw11:00000000 00:07:39.085 [2024-04-24 19:14:25.997564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.085 #26 NEW cov: 11848 ft: 14058 corp: 21/78b lim: 10 exec/s: 26 rss: 73Mb L: 5/10 MS: 1 CMP- DE: "\000\000"- 00:07:39.085 [2024-04-24 19:14:26.037431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.085 [2024-04-24 19:14:26.037457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.085 #27 NEW cov: 11848 ft: 14198 corp: 22/80b lim: 10 exec/s: 27 rss: 73Mb L: 2/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:39.085 [2024-04-24 19:14:26.077681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000899b cdw11:00000000 00:07:39.085 [2024-04-24 19:14:26.077706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.086 [2024-04-24 19:14:26.077775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009c9b cdw11:00000000 00:07:39.086 [2024-04-24 19:14:26.077789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.086 #28 NEW cov: 11848 ft: 14220 corp: 23/85b lim: 10 exec/s: 28 rss: 73Mb L: 5/10 MS: 1 ChangeBinInt- 00:07:39.344 [2024-04-24 19:14:26.117666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4b cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.117690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 #29 NEW cov: 11848 ft: 14279 corp: 24/87b lim: 10 exec/s: 29 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:07:39.344 [2024-04-24 19:14:26.157791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.157816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 #30 NEW cov: 11848 ft: 14281 corp: 25/90b lim: 10 exec/s: 30 rss: 73Mb L: 3/10 MS: 1 CrossOver- 00:07:39.344 [2024-04-24 19:14:26.198012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.198037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.198111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000265d cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.198126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.344 #31 NEW cov: 11848 ft: 14317 corp: 26/95b lim: 10 exec/s: 31 rss: 73Mb L: 5/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:39.344 [2024-04-24 19:14:26.238019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000300a 
cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.238043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 #32 NEW cov: 11848 ft: 14372 corp: 27/97b lim: 10 exec/s: 32 rss: 73Mb L: 2/10 MS: 1 EraseBytes- 00:07:39.344 [2024-04-24 19:14:26.278492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002655 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.278516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.278570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000555d cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.278583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.278635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006655 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.278649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.278703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005555 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.278716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.344 #33 NEW cov: 11848 ft: 14384 corp: 28/106b lim: 10 exec/s: 33 rss: 73Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:39.344 [2024-04-24 19:14:26.318392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.318416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.318471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007099 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.318485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.344 #34 NEW cov: 11848 ft: 14397 corp: 29/111b lim: 10 exec/s: 34 rss: 73Mb L: 5/10 MS: 1 ChangeByte- 00:07:39.344 [2024-04-24 19:14:26.358516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000005d cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.358544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.344 [2024-04-24 19:14:26.358598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000000 00:07:39.344 [2024-04-24 19:14:26.358612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.603 #35 NEW cov: 11848 ft: 14412 corp: 30/116b lim: 10 exec/s: 35 rss: 73Mb L: 5/10 MS: 1 CrossOver- 00:07:39.603 [2024-04-24 19:14:26.399009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000890a cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.399034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.399107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004e2e cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.399122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.399177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000cb8f cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.399191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.399244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ed6f cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.399258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.399310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00000900 cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.399324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.603 #36 NEW cov: 11848 ft: 14444 corp: 31/126b lim: 10 exec/s: 36 rss: 73Mb L: 10/10 MS: 1 CMP- DE: "N.\313\217\355o\011\000"- 00:07:39.603 [2024-04-24 19:14:26.438643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000f7 cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.438667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 #37 NEW cov: 11848 ft: 14452 corp: 32/128b lim: 10 exec/s: 37 rss: 73Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:39.603 [2024-04-24 19:14:26.478838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.478863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.478934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.478948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.603 #38 NEW cov: 11848 ft: 14460 corp: 33/132b lim: 10 exec/s: 38 rss: 73Mb L: 4/10 MS: 1 CopyPart- 00:07:39.603 [2024-04-24 19:14:26.518932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.518956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.519012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.519025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.603 #39 NEW cov: 11857 ft: 14492 corp: 34/136b lim: 10 exec/s: 39 rss: 73Mb L: 4/10 MS: 1 CopyPart- 00:07:39.603 [2024-04-24 
19:14:26.558923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005d26 cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.558947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 [2024-04-24 19:14:26.589032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005c26 cdw11:00000000 00:07:39.603 [2024-04-24 19:14:26.589055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.603 #41 NEW cov: 11857 ft: 14499 corp: 35/138b lim: 10 exec/s: 41 rss: 73Mb L: 2/10 MS: 2 ShuffleBytes-ChangeBinInt- 00:07:39.863 [2024-04-24 19:14:26.629456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000265d cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.629483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.629553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00005566 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.629568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.629623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005555 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.629636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.629688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005555 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.629702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.863 #42 NEW cov: 11857 ft: 14508 corp: 36/147b lim: 10 exec/s: 42 rss: 73Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:39.863 [2024-04-24 19:14:26.669242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4b cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.669266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 #43 NEW cov: 11857 ft: 14531 corp: 37/150b lim: 10 exec/s: 43 rss: 73Mb L: 3/10 MS: 1 CrossOver- 00:07:39.863 [2024-04-24 19:14:26.699286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a4a cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.699311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 #44 NEW cov: 11857 ft: 14571 corp: 38/153b lim: 10 exec/s: 44 rss: 73Mb L: 3/10 MS: 1 CopyPart- 00:07:39.863 [2024-04-24 19:14:26.739480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003d5d cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.739505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 #45 NEW cov: 11857 ft: 14645 corp: 39/156b lim: 10 exec/s: 45 rss: 73Mb L: 3/10 MS: 1 InsertByte- 00:07:39.863 
[2024-04-24 19:14:26.780051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000890a cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.780080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.780150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004e2e cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.780164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.780220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000cb8f cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.780236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.780292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ed6f cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.780306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.780361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000094e cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.780375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.863 #46 NEW cov: 11857 ft: 14659 corp: 40/166b lim: 10 exec/s: 46 rss: 74Mb L: 10/10 MS: 1 CopyPart- 00:07:39.863 [2024-04-24 19:14:26.820076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000295d cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.820101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.820173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006655 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.820188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.820242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005555 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.820256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.863 [2024-04-24 19:14:26.820311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005555 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.820324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.863 #47 NEW cov: 11857 ft: 14662 corp: 41/175b lim: 10 exec/s: 47 rss: 74Mb L: 9/10 MS: 1 ChangeByte- 00:07:39.863 [2024-04-24 19:14:26.859922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000daf4 cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.859946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.863 
[2024-04-24 19:14:26.860001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000996b cdw11:00000000 00:07:39.863 [2024-04-24 19:14:26.860015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.122 #48 NEW cov: 11857 ft: 14709 corp: 42/179b lim: 10 exec/s: 48 rss: 74Mb L: 4/10 MS: 1 ChangeByte- 00:07:40.122 [2024-04-24 19:14:26.900303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000295d cdw11:00000000 00:07:40.122 [2024-04-24 19:14:26.900327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.122 [2024-04-24 19:14:26.900385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006655 cdw11:00000000 00:07:40.122 [2024-04-24 19:14:26.900399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.122 [2024-04-24 19:14:26.900454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.122 [2024-04-24 19:14:26.900467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.122 [2024-04-24 19:14:26.900520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00005555 cdw11:00000000 00:07:40.122 [2024-04-24 19:14:26.900533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.122 #49 NEW cov: 11857 ft: 14714 corp: 43/188b lim: 10 exec/s: 24 rss: 74Mb L: 9/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:07:40.122 #49 DONE cov: 11857 ft: 14714 corp: 43/188b lim: 10 exec/s: 24 rss: 74Mb 00:07:40.122 ###### Recommended dictionary. ###### 00:07:40.122 "\000\000" # Uses: 3 00:07:40.122 "N.\313\217\355o\011\000" # Uses: 0 00:07:40.122 ###### End of recommended dictionary. 
###### 00:07:40.122 Done 49 runs in 2 second(s) 00:07:40.122 19:14:27 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:40.122 19:14:27 -- ../common.sh@72 -- # (( i++ )) 00:07:40.122 19:14:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.122 19:14:27 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:40.122 19:14:27 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:40.122 19:14:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.122 19:14:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.122 19:14:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:40.122 19:14:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:40.122 19:14:27 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:40.122 19:14:27 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:40.122 19:14:27 -- nvmf/run.sh@34 -- # printf %02d 7 00:07:40.122 19:14:27 -- nvmf/run.sh@34 -- # port=4407 00:07:40.122 19:14:27 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:40.122 19:14:27 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:40.122 19:14:27 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.122 19:14:27 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:40.122 19:14:27 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:40.122 19:14:27 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:40.122 [2024-04-24 19:14:27.100804] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:40.122 [2024-04-24 19:14:27.100902] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622301 ] 00:07:40.122 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.380 [2024-04-24 19:14:27.309779] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.381 [2024-04-24 19:14:27.382363] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.639 [2024-04-24 19:14:27.441717] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.639 [2024-04-24 19:14:27.457928] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:40.639 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:40.639 INFO: Seed: 3965584282 00:07:40.639 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:40.639 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:40.639 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:40.639 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.639 #2 INITED exec/s: 0 rss: 64Mb 00:07:40.639 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:40.639 This may also happen if the target rejected all inputs we tried so far 00:07:40.639 [2024-04-24 19:14:27.535647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:40.639 [2024-04-24 19:14:27.535691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.639 [2024-04-24 19:14:27.535791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005151 cdw11:00000000 00:07:40.639 [2024-04-24 19:14:27.535813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.639 [2024-04-24 19:14:27.535907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000510a cdw11:00000000 00:07:40.639 [2024-04-24 19:14:27.535922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.897 NEW_FUNC[1/669]: 0x48d200 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:40.897 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.897 #3 NEW cov: 11604 ft: 11605 corp: 2/7b lim: 10 exec/s: 0 rss: 70Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:40.897 [2024-04-24 19:14:27.865909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:40.897 [2024-04-24 19:14:27.865949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.897 [2024-04-24 19:14:27.866039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000510a cdw11:00000000 00:07:40.897 [2024-04-24 19:14:27.866056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.897 #4 NEW cov: 11734 ft: 12429 corp: 3/11b lim: 10 exec/s: 0 rss: 71Mb L: 4/6 MS: 1 EraseBytes- 00:07:41.155 [2024-04-24 19:14:27.926407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:27.926436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.155 [2024-04-24 19:14:27.926526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:27.926541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.155 [2024-04-24 
19:14:27.926623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a51 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:27.926636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.155 #5 NEW cov: 11740 ft: 12627 corp: 4/17b lim: 10 exec/s: 0 rss: 71Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:41.155 [2024-04-24 19:14:27.976393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:27.976422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.155 [2024-04-24 19:14:27.976511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000510a cdw11:00000000 00:07:41.155 [2024-04-24 19:14:27.976527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.155 #6 NEW cov: 11825 ft: 12848 corp: 5/21b lim: 10 exec/s: 0 rss: 71Mb L: 4/6 MS: 1 ChangeBit- 00:07:41.155 [2024-04-24 19:14:28.036898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:28.036926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.155 [2024-04-24 19:14:28.037014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002451 cdw11:00000000 00:07:41.155 [2024-04-24 19:14:28.037029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.155 #7 NEW cov: 11825 ft: 12899 corp: 6/26b lim: 10 exec/s: 0 rss: 71Mb L: 5/6 MS: 1 InsertByte- 00:07:41.155 [2024-04-24 19:14:28.097109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.097139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.156 [2024-04-24 19:14:28.097228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e00a cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.097245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.156 #8 NEW cov: 11825 ft: 12945 corp: 7/30b lim: 10 exec/s: 0 rss: 72Mb L: 4/6 MS: 1 ChangeByte- 00:07:41.156 [2024-04-24 19:14:28.148234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.148261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.156 [2024-04-24 19:14:28.148344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000051ff cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.148359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.156 [2024-04-24 19:14:28.148458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff 
cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.148474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.156 [2024-04-24 19:14:28.148557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.148571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.156 [2024-04-24 19:14:28.148650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000510a cdw11:00000000 00:07:41.156 [2024-04-24 19:14:28.148664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.156 #9 NEW cov: 11825 ft: 13235 corp: 8/40b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:41.414 [2024-04-24 19:14:28.197388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000106b cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.197414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.414 #13 NEW cov: 11825 ft: 13530 corp: 9/42b lim: 10 exec/s: 0 rss: 72Mb L: 2/10 MS: 4 CrossOver-CopyPart-ChangeByte-InsertByte- 00:07:41.414 [2024-04-24 19:14:28.247669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004242 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.247694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.414 #16 NEW cov: 11825 ft: 13561 corp: 10/44b lim: 10 exec/s: 0 rss: 72Mb L: 2/10 MS: 3 ChangeBit-ChangeBit-CopyPart- 00:07:41.414 [2024-04-24 19:14:28.298673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.298700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.298777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.298792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.298879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000229 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.298895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.298979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004573 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.298994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.414 #17 NEW cov: 11825 ft: 13605 corp: 11/53b lim: 10 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 CMP- DE: "\000\000\000\000\002)Es"- 00:07:41.414 [2024-04-24 19:14:28.349099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000051ff cdw11:00000000 00:07:41.414 [2024-04-24 
19:14:28.349124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.349201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.349215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.349302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.349316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.349402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.349417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.349499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000510a cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.349515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.414 #18 NEW cov: 11825 ft: 13626 corp: 12/63b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:41.414 [2024-04-24 19:14:28.409158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000029 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.409185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.409274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.409291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.409384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004500 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.409412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.414 [2024-04-24 19:14:28.409496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000273 cdw11:00000000 00:07:41.414 [2024-04-24 19:14:28.409513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.672 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.672 #19 NEW cov: 11848 ft: 13710 corp: 13/72b lim: 10 exec/s: 0 rss: 72Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:41.672 [2024-04-24 19:14:28.469706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002c00 cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.469733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.672 [2024-04-24 19:14:28.469828] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.469847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.672 [2024-04-24 19:14:28.469928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.469945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.672 [2024-04-24 19:14:28.470040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002945 cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.470057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.672 [2024-04-24 19:14:28.470140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000730a cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.470157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.672 #20 NEW cov: 11848 ft: 13807 corp: 14/82b lim: 10 exec/s: 0 rss: 72Mb L: 10/10 MS: 1 InsertByte- 00:07:41.672 [2024-04-24 19:14:28.519889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.672 [2024-04-24 19:14:28.519917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.520006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000051ff cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.520022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.520109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.520124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.520211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff42 cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.520227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.520312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000420a cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.520328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.673 #21 NEW cov: 11848 ft: 13809 corp: 15/92b lim: 10 exec/s: 21 rss: 72Mb L: 10/10 MS: 1 CrossOver- 00:07:41.673 [2024-04-24 19:14:28.569013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005141 cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.569040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.673 #22 NEW cov: 11848 ft: 13846 corp: 16/94b lim: 
10 exec/s: 22 rss: 72Mb L: 2/10 MS: 1 CrossOver- 00:07:41.673 [2024-04-24 19:14:28.620069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000419a cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.620096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.620188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009a9a cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.620205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.620288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009a51 cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.620305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.620384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000e00a cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.620400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.673 #23 NEW cov: 11848 ft: 13857 corp: 17/102b lim: 10 exec/s: 23 rss: 72Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:41.673 [2024-04-24 19:14:28.679978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.680005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.680105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009a9a cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.680122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.673 [2024-04-24 19:14:28.680207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009ae0 cdw11:00000000 00:07:41.673 [2024-04-24 19:14:28.680223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.932 #24 NEW cov: 11848 ft: 13869 corp: 18/109b lim: 10 exec/s: 24 rss: 72Mb L: 7/10 MS: 1 CrossOver- 00:07:41.932 [2024-04-24 19:14:28.730230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a51 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.730257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.730355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.730372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.730456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000510a cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.730472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.932 #25 NEW cov: 11848 ft: 13893 corp: 19/115b lim: 10 exec/s: 25 rss: 72Mb L: 6/10 MS: 1 ChangeByte- 00:07:41.932 [2024-04-24 19:14:28.780522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.780548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.780633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff41 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.780649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.780738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.780754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.932 #26 NEW cov: 11848 ft: 13930 corp: 20/122b lim: 10 exec/s: 26 rss: 72Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:07:41.932 [2024-04-24 19:14:28.831466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005141 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.831492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.831586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.831603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.831697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.831712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.831793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000229 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.831809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.831894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004573 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.831910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:41.932 #27 NEW cov: 11848 ft: 13955 corp: 21/132b lim: 10 exec/s: 27 rss: 72Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000\000\000\002)Es"- 00:07:41.932 [2024-04-24 19:14:28.891659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.891685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.891769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) 
qid:0 cid:5 nsid:0 cdw10:00009a9a cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.891784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.891864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000519a cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.891878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.932 [2024-04-24 19:14:28.891958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005151 cdw11:00000000 00:07:41.932 [2024-04-24 19:14:28.891974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.932 #28 NEW cov: 11848 ft: 13972 corp: 22/140b lim: 10 exec/s: 28 rss: 72Mb L: 8/10 MS: 1 CrossOver- 00:07:42.191 [2024-04-24 19:14:28.951221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005151 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:28.951248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.191 #29 NEW cov: 11848 ft: 14012 corp: 23/142b lim: 10 exec/s: 29 rss: 73Mb L: 2/10 MS: 1 CopyPart- 00:07:42.191 [2024-04-24 19:14:29.002387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.002413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.002495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009aa2 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.002511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.002597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000519a cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.002612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.002688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00005151 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.002704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.191 #30 NEW cov: 11848 ft: 14024 corp: 24/150b lim: 10 exec/s: 30 rss: 73Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:42.191 [2024-04-24 19:14:29.062796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c665 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.062822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.062902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006565 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.062918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.191 
[2024-04-24 19:14:29.062999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000065ae cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.063013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.063105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001ff5 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.063119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.191 #31 NEW cov: 11848 ft: 14091 corp: 25/158b lim: 10 exec/s: 31 rss: 73Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:42.191 [2024-04-24 19:14:29.123359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a51 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.123386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.123461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005151 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.123477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.123560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002f2f cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.123575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.123660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002f2f cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.123676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.123766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000510a cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.123782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.191 #32 NEW cov: 11848 ft: 14122 corp: 26/168b lim: 10 exec/s: 32 rss: 73Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:42.191 [2024-04-24 19:14:29.183218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004151 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.183246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.183341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008a9a cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.183356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.191 [2024-04-24 19:14:29.183448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00009ae0 cdw11:00000000 00:07:42.191 [2024-04-24 19:14:29.183465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:42.191 #33 NEW cov: 11848 ft: 14159 corp: 27/175b lim: 10 exec/s: 33 rss: 73Mb L: 7/10 MS: 1 ChangeBit- 00:07:42.450 [2024-04-24 19:14:29.232797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004232 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.232824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.450 #34 NEW cov: 11848 ft: 14180 corp: 28/177b lim: 10 exec/s: 34 rss: 73Mb L: 2/10 MS: 1 ChangeByte- 00:07:42.450 [2024-04-24 19:14:29.283213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000942 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.283238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.450 #36 NEW cov: 11848 ft: 14205 corp: 29/179b lim: 10 exec/s: 36 rss: 73Mb L: 2/10 MS: 2 EraseBytes-InsertByte- 00:07:42.450 [2024-04-24 19:14:29.333330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00005171 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.333356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.450 #37 NEW cov: 11848 ft: 14217 corp: 30/181b lim: 10 exec/s: 37 rss: 73Mb L: 2/10 MS: 1 ChangeBit- 00:07:42.450 [2024-04-24 19:14:29.384550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000c667 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.384577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.384658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006565 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.384674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.384762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000065ae cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.384776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.384862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001ff5 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.384878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.450 #38 NEW cov: 11848 ft: 14234 corp: 31/189b lim: 10 exec/s: 38 rss: 73Mb L: 8/10 MS: 1 ChangeBit- 00:07:42.450 [2024-04-24 19:14:29.444998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000051ff cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.445023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.445116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.445133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:42.450 [2024-04-24 19:14:29.445219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff51 cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.445236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.445326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff4c cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.445340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.450 [2024-04-24 19:14:29.445428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000510a cdw11:00000000 00:07:42.450 [2024-04-24 19:14:29.445443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:42.708 #39 NEW cov: 11848 ft: 14259 corp: 32/199b lim: 10 exec/s: 39 rss: 73Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:42.708 [2024-04-24 19:14:29.504096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00009b0b cdw11:00000000 00:07:42.708 [2024-04-24 19:14:29.504121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.708 #43 NEW cov: 11848 ft: 14294 corp: 33/201b lim: 10 exec/s: 21 rss: 73Mb L: 2/10 MS: 4 CopyPart-ShuffleBytes-ChangeByte-InsertByte- 00:07:42.708 #43 DONE cov: 11848 ft: 14294 corp: 33/201b lim: 10 exec/s: 21 rss: 73Mb 00:07:42.708 ###### Recommended dictionary. ###### 00:07:42.708 "\000\000\000\000\002)Es" # Uses: 1 00:07:42.708 ###### End of recommended dictionary. 
###### 00:07:42.708 Done 43 runs in 2 second(s) 00:07:42.708 19:14:29 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.708 19:14:29 -- ../common.sh@72 -- # (( i++ )) 00:07:42.708 19:14:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.708 19:14:29 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:42.708 19:14:29 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:42.708 19:14:29 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.708 19:14:29 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.708 19:14:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:42.708 19:14:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:42.708 19:14:29 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:42.708 19:14:29 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:42.708 19:14:29 -- nvmf/run.sh@34 -- # printf %02d 8 00:07:42.708 19:14:29 -- nvmf/run.sh@34 -- # port=4408 00:07:42.708 19:14:29 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:42.708 19:14:29 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:42.708 19:14:29 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.708 19:14:29 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.708 19:14:29 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:42.708 19:14:29 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:07:42.708 [2024-04-24 19:14:29.688218] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:42.708 [2024-04-24 19:14:29.688289] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622644 ] 00:07:42.708 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.967 [2024-04-24 19:14:29.894282] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.967 [2024-04-24 19:14:29.967968] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.224 [2024-04-24 19:14:30.027579] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.224 [2024-04-24 19:14:30.043791] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:43.224 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:43.224 INFO: Seed: 2256628749 00:07:43.224 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:43.224 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:43.224 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:43.224 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.224 [2024-04-24 19:14:30.109149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.109192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.224 #2 INITED cov: 11632 ft: 11633 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:43.224 [2024-04-24 19:14:30.149125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.149150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.224 #3 NEW cov: 11762 ft: 12193 corp: 2/2b lim: 5 exec/s: 0 rss: 70Mb L: 1/1 MS: 1 CrossOver- 00:07:43.224 [2024-04-24 19:14:30.199724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.199750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.224 [2024-04-24 19:14:30.199805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.199819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.224 [2024-04-24 19:14:30.199874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.199888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.224 [2024-04-24 19:14:30.199941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.224 [2024-04-24 19:14:30.199954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.224 #4 NEW cov: 11768 ft: 13186 corp: 3/6b lim: 5 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:43.481 [2024-04-24 19:14:30.249411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.481 [2024-04-24 19:14:30.249437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.481 #5 NEW cov: 11853 ft: 13429 corp: 4/7b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:43.481 [2024-04-24 19:14:30.289517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.481 [2024-04-24 19:14:30.289542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.481 #6 NEW cov: 11853 ft: 13578 corp: 5/8b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:43.482 [2024-04-24 19:14:30.329631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.329656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.482 #7 NEW cov: 11853 ft: 13659 corp: 6/9b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeBit- 00:07:43.482 [2024-04-24 19:14:30.369744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.369770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.482 #8 NEW cov: 11853 ft: 13737 corp: 7/10b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 CopyPart- 00:07:43.482 [2024-04-24 19:14:30.410485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.410522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.482 [2024-04-24 19:14:30.410577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.410591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.482 [2024-04-24 19:14:30.410661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.410674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.482 [2024-04-24 19:14:30.410727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.410741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.482 [2024-04-24 19:14:30.410796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.482 [2024-04-24 19:14:30.410809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.482 #9 NEW cov: 11853 ft: 13820 corp: 8/15b lim: 5 exec/s: 0 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:07:43.482 [2024-04-24 19:14:30.459988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
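Each "#N NEW cov: ..." line in this stream is libFuzzer reporting that one mutated input reached new code: cov is the number of covered edges, ft the number of features, corp the corpus size as units/bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, and the trailing MS field names the mutation sequence that produced the input (e.g. "MS: 1 InsertByte-"). To pull the coverage trajectory out of a saved copy of this log, something along these lines works (build.log is a stand-in filename, not a file this job produces):

  # Print the run counter and edge-coverage figure from each status line.
  grep -oE '#[0-9]+ (INITED|NEW|DONE) +cov: [0-9]+' build.log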
00:07:43.482 [2024-04-24 19:14:30.460013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.482 #10 NEW cov: 11853 ft: 13845 corp: 9/16b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:43.739 [2024-04-24 19:14:30.500154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.500180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 #11 NEW cov: 11853 ft: 13935 corp: 10/17b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 CopyPart- 00:07:43.739 [2024-04-24 19:14:30.540422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.540447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.540502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.540516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.739 #12 NEW cov: 11853 ft: 14123 corp: 11/19b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:07:43.739 [2024-04-24 19:14:30.590419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.590443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 #13 NEW cov: 11853 ft: 14133 corp: 12/20b lim: 5 exec/s: 0 rss: 71Mb L: 1/5 MS: 1 ChangeBit- 00:07:43.739 [2024-04-24 19:14:30.630678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.630702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.630758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.630772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.739 #14 NEW cov: 11853 ft: 14163 corp: 13/22b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:07:43.739 [2024-04-24 19:14:30.671243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.671267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.671338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:43.739 [2024-04-24 19:14:30.671352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.671405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.671419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.671489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.671503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.671558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.671571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.739 #15 NEW cov: 11853 ft: 14201 corp: 14/27b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeByte- 00:07:43.739 [2024-04-24 19:14:30.720926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.720950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.739 [2024-04-24 19:14:30.721002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.739 [2024-04-24 19:14:30.721016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.739 #16 NEW cov: 11853 ft: 14223 corp: 15/29b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:07:43.997 [2024-04-24 19:14:30.761505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.761530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.761586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.761600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.761653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.761669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.761721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 
nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.761735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.761790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.761804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.997 #17 NEW cov: 11853 ft: 14233 corp: 16/34b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 ChangeBit- 00:07:43.997 [2024-04-24 19:14:30.801289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.801314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.801368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-04-24 19:14:30.801383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.997 [2024-04-24 19:14:30.801438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.801452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 #18 NEW cov: 11853 ft: 14389 corp: 17/37b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 CopyPart- 00:07:43.998 [2024-04-24 19:14:30.851595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.851620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.851674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.851688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.851740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.851753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.851807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.851820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.998 #19 NEW cov: 11853 ft: 14428 corp: 18/41b lim: 5 exec/s: 0 rss: 72Mb 
L: 4/5 MS: 1 CrossOver- 00:07:43.998 [2024-04-24 19:14:30.901267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.901291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 #20 NEW cov: 11853 ft: 14443 corp: 19/42b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ChangeByte- 00:07:43.998 [2024-04-24 19:14:30.931982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.932009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.932087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.932112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.932182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.932195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.932249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.932262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.932318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.932331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.998 #21 NEW cov: 11853 ft: 14464 corp: 20/47b lim: 5 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 CopyPart- 00:07:43.998 [2024-04-24 19:14:30.981959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.981984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.982039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.982053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.982109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.982122] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 [2024-04-24 19:14:30.982177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-04-24 19:14:30.982190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.564 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.564 #22 NEW cov: 11876 ft: 14503 corp: 21/51b lim: 5 exec/s: 22 rss: 73Mb L: 4/5 MS: 1 InsertByte- 00:07:44.564 [2024-04-24 19:14:31.322521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.322568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 #23 NEW cov: 11876 ft: 14573 corp: 22/52b lim: 5 exec/s: 23 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:07:44.564 [2024-04-24 19:14:31.372562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.372591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 #24 NEW cov: 11876 ft: 14592 corp: 23/53b lim: 5 exec/s: 24 rss: 73Mb L: 1/5 MS: 1 ChangeBit- 00:07:44.564 [2024-04-24 19:14:31.412644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.412669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 #25 NEW cov: 11876 ft: 14617 corp: 24/54b lim: 5 exec/s: 25 rss: 73Mb L: 1/5 MS: 1 EraseBytes- 00:07:44.564 [2024-04-24 19:14:31.452771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.452797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 #26 NEW cov: 11876 ft: 14622 corp: 25/55b lim: 5 exec/s: 26 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:44.564 [2024-04-24 19:14:31.492860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.492884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 #27 NEW cov: 11876 ft: 14647 corp: 26/56b lim: 5 exec/s: 27 rss: 73Mb L: 1/5 MS: 1 CrossOver- 00:07:44.564 [2024-04-24 19:14:31.533656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.533681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 
19:14:31.533740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.533754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.533812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.533826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.533885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.533900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.533960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.533974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.564 #28 NEW cov: 11876 ft: 14657 corp: 27/61b lim: 5 exec/s: 28 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:07:44.564 [2024-04-24 19:14:31.573797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.573822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.573881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.573895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.573956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.573985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.574040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.574053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.564 [2024-04-24 19:14:31.574117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.564 [2024-04-24 19:14:31.574131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.821 #29 NEW cov: 11876 ft: 14663 corp: 28/66b lim: 5 exec/s: 
29 rss: 73Mb L: 5/5 MS: 1 CrossOver- 00:07:44.821 [2024-04-24 19:14:31.623892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.623918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.623978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.623992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.624048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.624066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.624120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.624134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.624190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.624205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.821 #30 NEW cov: 11876 ft: 14695 corp: 29/71b lim: 5 exec/s: 30 rss: 73Mb L: 5/5 MS: 1 CopyPart- 00:07:44.821 [2024-04-24 19:14:31.663573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.663598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.663657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.663672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 #31 NEW cov: 11876 ft: 14701 corp: 30/73b lim: 5 exec/s: 31 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:07:44.821 [2024-04-24 19:14:31.703669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.703697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.703756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.703770] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 #32 NEW cov: 11876 ft: 14733 corp: 31/75b lim: 5 exec/s: 32 rss: 73Mb L: 2/5 MS: 1 ChangeByte- 00:07:44.821 [2024-04-24 19:14:31.743619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.743644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 #33 NEW cov: 11876 ft: 14736 corp: 32/76b lim: 5 exec/s: 33 rss: 73Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:44.821 [2024-04-24 19:14:31.773738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.773762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 #34 NEW cov: 11876 ft: 14749 corp: 33/77b lim: 5 exec/s: 34 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:07:44.821 [2024-04-24 19:14:31.814505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.814529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.814584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.814599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.814655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.814669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.821 [2024-04-24 19:14:31.814726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.821 [2024-04-24 19:14:31.814740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.822 [2024-04-24 19:14:31.814795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.822 [2024-04-24 19:14:31.814812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.822 #35 NEW cov: 11876 ft: 14777 corp: 34/82b lim: 5 exec/s: 35 rss: 74Mb L: 5/5 MS: 1 ChangeBit- 00:07:45.079 [2024-04-24 19:14:31.854136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.854160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:31.854216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.854231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.079 #36 NEW cov: 11876 ft: 14843 corp: 35/84b lim: 5 exec/s: 36 rss: 74Mb L: 2/5 MS: 1 CrossOver- 00:07:45.079 [2024-04-24 19:14:31.904094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.904119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.079 #37 NEW cov: 11876 ft: 14873 corp: 36/85b lim: 5 exec/s: 37 rss: 74Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:45.079 [2024-04-24 19:14:31.944382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.944406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:31.944464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.944478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.079 #38 NEW cov: 11876 ft: 14888 corp: 37/87b lim: 5 exec/s: 38 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:07:45.079 [2024-04-24 19:14:31.984333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:31.984357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.079 #39 NEW cov: 11876 ft: 14922 corp: 38/88b lim: 5 exec/s: 39 rss: 74Mb L: 1/5 MS: 1 ChangeByte- 00:07:45.079 [2024-04-24 19:14:32.025087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.025111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.025168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.025182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.025236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.025249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.025304] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.025318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.025372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.025385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:45.079 #40 NEW cov: 11876 ft: 14947 corp: 39/93b lim: 5 exec/s: 40 rss: 74Mb L: 5/5 MS: 1 CrossOver- 00:07:45.079 [2024-04-24 19:14:32.065043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.065072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.065140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.065158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.065211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.065225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.079 [2024-04-24 19:14:32.065281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.079 [2024-04-24 19:14:32.065294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.079 #41 NEW cov: 11876 ft: 14965 corp: 40/97b lim: 5 exec/s: 41 rss: 74Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:45.336 [2024-04-24 19:14:32.104793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.336 [2024-04-24 19:14:32.104818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.336 [2024-04-24 19:14:32.104874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.336 [2024-04-24 19:14:32.104888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.336 #42 NEW cov: 11876 ft: 14996 corp: 41/99b lim: 5 exec/s: 21 rss: 74Mb L: 2/5 MS: 1 InsertByte- 00:07:45.336 #42 DONE cov: 11876 ft: 14996 corp: 41/99b lim: 5 exec/s: 21 rss: 74Mb 00:07:45.336 Done 42 runs in 2 second(s) 00:07:45.336 19:14:32 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:07:45.336 19:14:32 -- 
../common.sh@72 -- # (( i++ )) 00:07:45.336 19:14:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.336 19:14:32 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:45.336 19:14:32 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:45.336 19:14:32 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.336 19:14:32 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.336 19:14:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:45.336 19:14:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:45.336 19:14:32 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:45.336 19:14:32 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:45.336 19:14:32 -- nvmf/run.sh@34 -- # printf %02d 9 00:07:45.336 19:14:32 -- nvmf/run.sh@34 -- # port=4409 00:07:45.336 19:14:32 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:45.336 19:14:32 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:45.336 19:14:32 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.337 19:14:32 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.337 19:14:32 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:45.337 19:14:32 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:07:45.337 [2024-04-24 19:14:32.299509] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:45.337 [2024-04-24 19:14:32.299610] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623003 ] 00:07:45.337 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.595 [2024-04-24 19:14:32.506569] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.595 [2024-04-24 19:14:32.580620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.853 [2024-04-24 19:14:32.640166] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.853 [2024-04-24 19:14:32.656370] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:45.853 INFO: Running with entropic power schedule (0xFF, 100). 
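The ../common.sh lines above show the driver simply bumping a counter and calling start_llvm_fuzz again, so the whole short-fuzz pass is effectively the loop sketched here (the loop form is inferred from the (( i++ )) and (( i < fuzz_num )) trace lines; fuzz_num itself is never printed in this excerpt):

  # Inferred shape of the common.sh driver: one timed run per fuzzer type.
  for (( i = 0; i < fuzz_num; i++ )); do
      start_llvm_fuzz "$i" 1 0x1   # type, seconds, core mask; port is 44$(printf %02d "$i")
  done

Any single run can also be replayed by hand with the command printed at nvmf/run.sh@45, e.g. for this run: llvm_nvme_fuzz -m 0x1 -s 512 -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D <corpus_dir> -Z 9, provided the config file and suppression file are set up first.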
00:07:45.853 INFO: Seed: 574670431 00:07:45.853 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:45.853 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:45.853 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:45.853 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.853 [2024-04-24 19:14:32.733494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.733540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.853 #2 INITED cov: 11632 ft: 11633 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:45.853 [2024-04-24 19:14:32.784415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.784445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.853 [2024-04-24 19:14:32.784546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.784564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.853 [2024-04-24 19:14:32.784678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.784697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.853 [2024-04-24 19:14:32.784789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.784806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.853 #3 NEW cov: 11762 ft: 12811 corp: 2/5b lim: 5 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:45.853 [2024-04-24 19:14:32.843380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.853 [2024-04-24 19:14:32.843406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.853 #4 NEW cov: 11768 ft: 12943 corp: 3/6b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.111 [2024-04-24 19:14:32.893990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:32.894018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.111 [2024-04-24 19:14:32.894112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
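"A corpus is not provided" and "0 files found" above just mean that the freshly created llvm_nvmf_9 directory handed to -D was empty, so this run starts from scratch. libFuzzer normally writes each coverage-increasing unit back into its corpus directory, so a later pass pointed at the same -D path starts warm; what a run left behind can be checked afterwards with a plain listing (sketch, using the directory from the trace):

  # Inspect any corpus units run 9 wrote back, newest first.
  ls -lt /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 | head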
00:07:46.111 [2024-04-24 19:14:32.894130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.111 #5 NEW cov: 11853 ft: 13452 corp: 4/8b lim: 5 exec/s: 0 rss: 70Mb L: 2/4 MS: 1 CrossOver- 00:07:46.111 [2024-04-24 19:14:32.943785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:32.943812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.111 #6 NEW cov: 11853 ft: 13579 corp: 5/9b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.111 [2024-04-24 19:14:33.004002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.004029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.111 #7 NEW cov: 11853 ft: 13672 corp: 6/10b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.111 [2024-04-24 19:14:33.054277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.054304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.111 #8 NEW cov: 11853 ft: 13745 corp: 7/11b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.111 [2024-04-24 19:14:33.115477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.115502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.111 [2024-04-24 19:14:33.115590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.115606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.111 [2024-04-24 19:14:33.115687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.115703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.111 [2024-04-24 19:14:33.115783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.111 [2024-04-24 19:14:33.115799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.369 #9 NEW cov: 11853 ft: 13879 corp: 8/15b lim: 5 exec/s: 0 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:46.369 [2024-04-24 19:14:33.174578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
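Run to run, the fuzzer is aimed at a different admin opcode: the tail of run 7 printed DELETE IO SQ (00), run 8 printed NAMESPACE ATTACHMENT (15), and this run exercises NAMESPACE MANAGEMENT (0d). Which opcodes a whole pass touched can be tallied from a saved log (hypothetical post-processing; build.log is again a stand-in filename):

  # Count the admin opcodes echoed by nvme_admin_qpair_print_command.
  grep -oE '\*NOTICE\*: [A-Z ]+ \([0-9a-f]{2}\)' build.log | sort | uniq -c | sort -rn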
00:07:46.369 [2024-04-24 19:14:33.174604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.369 #10 NEW cov: 11853 ft: 13908 corp: 9/16b lim: 5 exec/s: 0 rss: 70Mb L: 1/4 MS: 1 CopyPart-
00:07:46.369 [2024-04-24 19:14:33.225823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.225848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.369 [2024-04-24 19:14:33.225934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.225950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:46.369 [2024-04-24 19:14:33.226033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.226051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:46.369 [2024-04-24 19:14:33.226135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.226152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:46.369 #11 NEW cov: 11853 ft: 13976 corp: 10/20b lim: 5 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 ShuffleBytes-
00:07:46.369 [2024-04-24 19:14:33.285706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.285731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.369 [2024-04-24 19:14:33.285815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.285831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:46.369 [2024-04-24 19:14:33.285923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.285938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:46.369 #12 NEW cov: 11853 ft: 14185 corp: 11/23b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 EraseBytes-
00:07:46.369 [2024-04-24 19:14:33.345120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.369 [2024-04-24 19:14:33.345145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.369 #13 NEW cov: 11853 ft: 14196 corp: 12/24b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 CopyPart-
00:07:46.627 [2024-04-24 19:14:33.395414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.395441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.627 #14 NEW cov: 11853 ft: 14218 corp: 13/25b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 ChangeBinInt-
00:07:46.627 [2024-04-24 19:14:33.445643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.445671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.627 #15 NEW cov: 11853 ft: 14233 corp: 14/26b lim: 5 exec/s: 0 rss: 71Mb L: 1/4 MS: 1 ShuffleBytes-
00:07:46.627 [2024-04-24 19:14:33.506579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.506605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.627 [2024-04-24 19:14:33.506699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.506716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:46.627 [2024-04-24 19:14:33.506803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.506822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:46.627 #16 NEW cov: 11853 ft: 14254 corp: 15/29b lim: 5 exec/s: 0 rss: 71Mb L: 3/4 MS: 1 ShuffleBytes-
00:07:46.627 [2024-04-24 19:14:33.566393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.627 [2024-04-24 19:14:33.566417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:46.628 [2024-04-24 19:14:33.566517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.628 [2024-04-24 19:14:33.566533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:46.886 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:46.886 #17 NEW cov: 11876 ft: 14280 corp: 16/31b lim: 5 exec/s: 17 rss: 72Mb L: 2/4 MS: 1 InsertByte-
00:07:46.886 [2024-04-24 19:14:33.897325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:46.886 [2024-04-24 19:14:33.897367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.144 #18 NEW cov: 11876 ft: 14303 corp: 17/32b lim: 5 exec/s: 18 rss: 72Mb L: 1/4 MS: 1 CrossOver-
00:07:47.144 [2024-04-24 19:14:33.958301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:33.958332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:33.958421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:33.958440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:33.958538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:33.958555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.144 #19 NEW cov: 11876 ft: 14322 corp: 18/35b lim: 5 exec/s: 19 rss: 72Mb L: 3/4 MS: 1 ShuffleBytes-
00:07:47.144 [2024-04-24 19:14:34.007881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.007908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.144 #20 NEW cov: 11876 ft: 14340 corp: 19/36b lim: 5 exec/s: 20 rss: 72Mb L: 1/4 MS: 1 ChangeByte-
00:07:47.144 [2024-04-24 19:14:34.068634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.068660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:34.068747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.068764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.144 #21 NEW cov: 11876 ft: 14389 corp: 20/38b lim: 5 exec/s: 21 rss: 72Mb L: 2/4 MS: 1 InsertByte-
00:07:47.144 [2024-04-24 19:14:34.119581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.119613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:34.119702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.119719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:34.119816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.119833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.144 [2024-04-24 19:14:34.119923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.144 [2024-04-24 19:14:34.119940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.144 #22 NEW cov: 11876 ft: 14398 corp: 21/42b lim: 5 exec/s: 22 rss: 73Mb L: 4/4 MS: 1 InsertByte-
00:07:47.402 [2024-04-24 19:14:34.180044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.402 [2024-04-24 19:14:34.180075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.402 [2024-04-24 19:14:34.180175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.402 [2024-04-24 19:14:34.180193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.402 [2024-04-24 19:14:34.180288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.402 [2024-04-24 19:14:34.180305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.402 [2024-04-24 19:14:34.180404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.402 [2024-04-24 19:14:34.180421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.402 [2024-04-24 19:14:34.180520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.402 [2024-04-24 19:14:34.180538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:47.403 #23 NEW cov: 11876 ft: 14448 corp: 22/47b lim: 5 exec/s: 23 rss: 73Mb L: 5/5 MS: 1 CrossOver-
00:07:47.403 [2024-04-24 19:14:34.240353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.240379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.240474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.240493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.240588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.240607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.240687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.240702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.240788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.240804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:47.403 #24 NEW cov: 11876 ft: 14468 corp: 23/52b lim: 5 exec/s: 24 rss: 73Mb L: 5/5 MS: 1 InsertByte-
00:07:47.403 [2024-04-24 19:14:34.299212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.299238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.403 #25 NEW cov: 11876 ft: 14491 corp: 24/53b lim: 5 exec/s: 25 rss: 73Mb L: 1/5 MS: 1 ChangeByte-
00:07:47.403 [2024-04-24 19:14:34.349517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.349543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.403 #26 NEW cov: 11876 ft: 14518 corp: 25/54b lim: 5 exec/s: 26 rss: 73Mb L: 1/5 MS: 1 ChangeBit-
00:07:47.403 [2024-04-24 19:14:34.400758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.400784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.400889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.400905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.400996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.401013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.403 [2024-04-24 19:14:34.401111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.403 [2024-04-24 19:14:34.401129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.661 #27 NEW cov: 11876 ft: 14539 corp: 26/58b lim: 5 exec/s: 27 rss: 73Mb L: 4/5 MS: 1 CopyPart-
00:07:47.661 [2024-04-24 19:14:34.451294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.451320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.451419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.451435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.451525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.451542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.451627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.451644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.451729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.451745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:47.661 #28 NEW cov: 11876 ft: 14549 corp: 27/63b lim: 5 exec/s: 28 rss: 73Mb L: 5/5 MS: 1 CMP- DE: "\001\000\000n"-
00:07:47.661 [2024-04-24 19:14:34.501808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.501835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.501919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.501935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.502026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.502043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.502136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.502153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.502237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.502255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:47.661 #29 NEW cov: 11876 ft: 14568 corp: 28/68b lim: 5 exec/s: 29 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes-
00:07:47.661 [2024-04-24 19:14:34.551300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.551326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.551416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.551435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.551530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.551546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.661 #30 NEW cov: 11876 ft: 14602 corp: 29/71b lim: 5 exec/s: 30 rss: 73Mb L: 3/5 MS: 1 CrossOver-
00:07:47.661 [2024-04-24 19:14:34.600963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.600989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.661 #31 NEW cov: 11876 ft: 14652 corp: 30/72b lim: 5 exec/s: 31 rss: 73Mb L: 1/5 MS: 1 ChangeBit-
00:07:47.661 [2024-04-24 19:14:34.651428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.651453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.661 [2024-04-24 19:14:34.651540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.661 [2024-04-24 19:14:34.651556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.919 #32 NEW cov: 11876 ft: 14671 corp: 31/74b lim: 5 exec/s: 32 rss: 73Mb L: 2/5 MS: 1 ChangeByte-
00:07:47.919 [2024-04-24 19:14:34.712724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.919 [2024-04-24 19:14:34.712760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:47.919 [2024-04-24 19:14:34.712850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.919 [2024-04-24 19:14:34.712868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:47.919 [2024-04-24 19:14:34.712963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.919 [2024-04-24 19:14:34.712980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:47.919 [2024-04-24 19:14:34.713075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.919 [2024-04-24 19:14:34.713091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:47.919 [2024-04-24 19:14:34.713180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:47.919 [2024-04-24 19:14:34.713197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:07:47.919 #33 NEW cov: 11876 ft: 14686 corp: 32/79b lim: 5 exec/s: 16 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes-
00:07:47.919 #33 DONE cov: 11876 ft: 14686 corp: 32/79b lim: 5 exec/s: 16 rss: 73Mb
00:07:47.919 ###### Recommended dictionary. ######
00:07:47.919 "\001\000\000n" # Uses: 0
00:07:47.919 ###### End of recommended dictionary. ######
00:07:47.919 Done 33 runs in 2 second(s)
00:07:47.919 19:14:34 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz
19:14:34 -- ../common.sh@72 -- # (( i++ ))
00:07:47.919 19:14:34 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:47.919 19:14:34 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1
00:07:47.919 19:14:34 -- nvmf/run.sh@23 -- # local fuzzer_type=10
00:07:47.919 19:14:34 -- nvmf/run.sh@24 -- # local timen=1
00:07:47.919 19:14:34 -- nvmf/run.sh@25 -- # local core=0x1
00:07:47.919 19:14:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:47.919 19:14:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf
00:07:47.919 19:14:34 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:47.919 19:14:34 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:47.919 19:14:34 -- nvmf/run.sh@34 -- # printf %02d 10
00:07:47.919 19:14:34 -- nvmf/run.sh@34 -- # port=4410
00:07:47.919 19:14:34 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:47.919 19:14:34 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410'
00:07:47.919 19:14:34 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:47.919 19:14:34 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:47.919 19:14:34 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:47.919 19:14:34 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10
00:07:47.919 [2024-04-24 19:14:34.883700] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
00:07:47.919 [2024-04-24 19:14:34.883759] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623359 ]
00:07:48.177 EAL: No free 2048 kB hugepages reported on node 1
00:07:48.177 [2024-04-24 19:14:35.078997] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:48.177 [2024-04-24 19:14:35.154759] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:48.435 [2024-04-24 19:14:35.214762] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:48.435 [2024-04-24 19:14:35.230956] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 ***
00:07:48.435 INFO: Running with entropic power schedule (0xFF, 100).
00:07:48.435 INFO: Seed: 3147666634
00:07:48.435 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2),
00:07:48.435 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48),
00:07:48.435 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10
00:07:48.435 INFO: A corpus is not provided, starting from an empty corpus
00:07:48.435 #2 INITED exec/s: 0 rss: 64Mb
00:07:48.435 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:48.435 This may also happen if the target rejected all inputs we tried so far
00:07:48.435 [2024-04-24 19:14:35.296654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.435 [2024-04-24 19:14:35.296683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.435 [2024-04-24 19:14:35.296741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.435 [2024-04-24 19:14:35.296756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.435 [2024-04-24 19:14:35.296812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.435 [2024-04-24 19:14:35.296826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.435 [2024-04-24 19:14:35.296885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.435 [2024-04-24 19:14:35.296898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.694 NEW_FUNC[1/670]: 0x48eb70 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205
00:07:48.694 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:48.694 #7 NEW cov: 11652 ft: 11655 corp: 2/35b lim: 40 exec/s: 0 rss: 71Mb L: 34/34 MS: 5 CrossOver-ChangeBit-ChangeBit-CMP-InsertRepeatedBytes- DE: "\000\000\000\000"-
00:07:48.694 [2024-04-24 19:14:35.637346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7647f cdw11:f26f0900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.694 [2024-04-24 19:14:35.637410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.694 #11 NEW cov: 11785 ft: 12882 corp: 3/46b lim: 40 exec/s: 0 rss: 71Mb L: 11/34 MS: 4 ChangeBit-InsertByte-InsertByte-CMP- DE: "\300\307d\177\362o\011\000"-
00:07:48.694 [2024-04-24 19:14:35.687476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.694 [2024-04-24 19:14:35.687502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.694 [2024-04-24 19:14:35.687562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.694 [2024-04-24 19:14:35.687576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.694 [2024-04-24 19:14:35.687636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.694 [2024-04-24 19:14:35.687650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.952 #12 NEW cov: 11791 ft: 13344 corp: 4/75b lim: 40 exec/s: 0 rss: 72Mb L: 29/34 MS: 1 EraseBytes-
00:07:48.952 [2024-04-24 19:14:35.737642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.737668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.737725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.737738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.737795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.737809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.952 #13 NEW cov: 11876 ft: 13658 corp: 5/104b lim: 40 exec/s: 0 rss: 72Mb L: 29/34 MS: 1 EraseBytes-
00:07:48.952 [2024-04-24 19:14:35.777853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.777878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.777938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.777952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.778006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.778023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.778086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.778100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.952 #18 NEW cov: 11876 ft: 13728 corp: 6/142b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 5 ShuffleBytes-ChangeBinInt-InsertByte-InsertByte-InsertRepeatedBytes-
00:07:48.952 [2024-04-24 19:14:35.817981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:21c04e00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.818006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.952 [2024-04-24 19:14:35.818066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.952 [2024-04-24 19:14:35.818081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.818156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.818169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.818228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00c7647f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.818245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.953 #26 NEW cov: 11876 ft: 13795 corp: 7/177b lim: 40 exec/s: 0 rss: 72Mb L: 35/38 MS: 3 EraseBytes-InsertByte-CrossOver-
00:07:48.953 [2024-04-24 19:14:35.857716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7647f cdw11:f26f0900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.857741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.953 #27 NEW cov: 11876 ft: 13857 corp: 8/188b lim: 40 exec/s: 0 rss: 72Mb L: 11/38 MS: 1 CrossOver-
00:07:48.953 [2024-04-24 19:14:35.898215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.898240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.898299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.898313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.898371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.898385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.898458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000fe cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.898472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.953 #28 NEW cov: 11876 ft: 13885 corp: 9/226b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 ChangeBinInt-
00:07:48.953 [2024-04-24 19:14:35.938333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:21c04e00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.938358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.938415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00c50000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.938431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.938486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.938500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:48.953 [2024-04-24 19:14:35.938556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00c7647f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:48.953 [2024-04-24 19:14:35.938570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:48.953 #29 NEW cov: 11876 ft: 13922 corp: 10/261b lim: 40 exec/s: 0 rss: 72Mb L: 35/38 MS: 1 ChangeByte-
00:07:49.211 [2024-04-24 19:14:35.978406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:4e000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:35.978432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:35.978491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:35.978505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:35.978564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:35.978578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:35.978636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:35.978650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:49.211 #30 NEW cov: 11876 ft: 13962 corp: 11/296b lim: 40 exec/s: 0 rss: 72Mb L: 35/38 MS: 1 CrossOver-
00:07:49.211 [2024-04-24 19:14:36.018492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.018518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.018591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.018605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.018666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00270000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.018682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.018743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.018756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:49.211 #31 NEW cov: 11876 ft: 13986 corp: 12/335b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 InsertByte-
00:07:49.211 [2024-04-24 19:14:36.058523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.058549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.058609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.058623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.058680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.058694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.211 #36 NEW cov: 11876 ft: 14029 corp: 13/361b lim: 40 exec/s: 0 rss: 72Mb L: 26/39 MS: 5 ChangeByte-ShuffleBytes-InsertByte-EraseBytes-InsertRepeatedBytes-
00:07:49.211 [2024-04-24 19:14:36.098408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff02 cdw11:647ff26f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.098433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 #37 NEW cov: 11876 ft: 14140 corp: 14/374b lim: 40 exec/s: 0 rss: 72Mb L: 13/39 MS: 1 CMP- DE: "\377\002"-
00:07:49.211 [2024-04-24 19:14:36.148543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff02 cdw11:647ff26f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.148568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:49.211 #38 NEW cov: 11899 ft: 14191 corp: 15/387b lim: 40 exec/s: 0 rss: 72Mb L: 13/39 MS: 1 ShuffleBytes-
00:07:49.211 [2024-04-24 19:14:36.199072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.199116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.199184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.199199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.199257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.199272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.211 [2024-04-24 19:14:36.199328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.211 [2024-04-24 19:14:36.199342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:49.211 #39 NEW cov: 11899 ft: 14239 corp: 16/425b lim: 40 exec/s: 0 rss: 72Mb L: 38/39 MS: 1 ChangeBit-
00:07:49.469 [2024-04-24 19:14:36.238797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff07 cdw11:647ff26f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.238822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 #40 NEW cov: 11899 ft: 14255 corp: 17/438b lim: 40 exec/s: 0 rss: 72Mb L: 13/39 MS: 1 ChangeBinInt-
00:07:49.470 [2024-04-24 19:14:36.279138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.279163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 [2024-04-24 19:14:36.279222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.279237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.470 [2024-04-24 19:14:36.279294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000037a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.279307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.470 #41 NEW cov: 11899 ft: 14264 corp: 18/463b lim: 40 exec/s: 41 rss: 73Mb L: 25/39 MS: 1 EraseBytes-
00:07:49.470 [2024-04-24 19:14:36.319027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff07 cdw11:647ff26f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.319051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 #42 NEW cov: 11899 ft: 14368 corp: 19/476b lim: 40 exec/s: 42 rss: 73Mb L: 13/39 MS: 1 ChangeByte-
00:07:49.470 [2024-04-24 19:14:36.359187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0000000 cdw11:00c7647f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.359211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 #43 NEW cov: 11899 ft: 14457 corp: 20/491b lim: 40 exec/s: 43 rss: 73Mb L: 15/39 MS: 1 PersAutoDict- DE: "\000\000\000\000"-
00:07:49.470 [2024-04-24 19:14:36.399268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff02 cdw11:647f316f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.399293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 #44 NEW cov: 11899 ft: 14485 corp: 21/504b lim: 40 exec/s: 44 rss: 73Mb L: 13/39 MS: 1 ChangeByte-
00:07:49.470 [2024-04-24 19:14:36.439376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff07 cdw11:647ff26f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.439402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 #45 NEW cov: 11899 ft: 14496 corp: 22/515b lim: 40 exec/s: 45 rss: 73Mb L: 11/39 MS: 1 EraseBytes-
00:07:49.470 [2024-04-24 19:14:36.479733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.479759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.470 [2024-04-24 19:14:36.479813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.479831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.470 [2024-04-24 19:14:36.479891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00232323 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.470 [2024-04-24 19:14:36.479905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.728 #46 NEW cov: 11899 ft: 14502 corp: 23/545b lim: 40 exec/s: 46 rss: 73Mb L: 30/39 MS: 1 InsertRepeatedBytes-
00:07:49.728 [2024-04-24 19:14:36.520010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:21c04e00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.520036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.520094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0007647f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.520108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.520164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.520177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.520236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.520249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:49.728 #47 NEW cov: 11899 ft: 14543 corp: 24/583b lim: 40 exec/s: 47 rss: 73Mb L: 38/39 MS: 1 CrossOver-
00:07:49.728 [2024-04-24 19:14:36.559697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac0c764 cdw11:7ff26f09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.559722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.728 #50 NEW cov: 11899 ft: 14561 corp: 25/593b lim: 40 exec/s: 50 rss: 73Mb L: 10/39 MS: 3 InsertByte-ChangeBit-PersAutoDict- DE: "\300\307d\177\362o\011\000"-
00:07:49.728 [2024-04-24 19:14:36.600087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.600113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.600171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.600184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.600239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.600253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.728 #51 NEW cov: 11899 ft: 14565 corp: 26/622b lim: 40 exec/s: 51 rss: 73Mb L: 29/39 MS: 1 ChangeBit-
00:07:49.728 [2024-04-24 19:14:36.650276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0000000 cdw11:00c7647f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.650303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.650370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f26f0606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.650384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.728 [2024-04-24 19:14:36.650444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:06060606 cdw11:06060606 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.650459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.728 #52 NEW cov: 11899 ft: 14574 corp: 27/653b lim: 40 exec/s: 52 rss: 73Mb L: 31/39 MS: 1 InsertRepeatedBytes-
00:07:49.728 [2024-04-24 19:14:36.700161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.728 [2024-04-24 19:14:36.700187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.729 #55 NEW cov: 11899 ft: 14586 corp: 28/663b lim: 40 exec/s: 55 rss: 73Mb L: 10/39 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes-
00:07:49.729 [2024-04-24 19:14:36.740297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c0c764 cdw11:7ff26f09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.729 [2024-04-24 19:14:36.740323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 #57 NEW cov: 11899 ft: 14635 corp: 29/678b lim: 40 exec/s: 57 rss: 73Mb L: 15/39 MS: 2 EraseBytes-PersAutoDict- DE: "\300\307d\177\362o\011\000"-
00:07:49.987 [2024-04-24 19:14:36.790819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.790845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.790900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.790914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.790972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.790986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.791044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00232323 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.791057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:49.987 #58 NEW cov: 11899 ft: 14646 corp: 30/716b lim: 40 exec/s: 58 rss: 73Mb L: 38/39 MS: 1 CopyPart-
00:07:49.987 [2024-04-24 19:14:36.830573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c7ff0264 cdw11:7ff26f09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.830599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 #59 NEW cov: 11899 ft: 14650 corp: 31/729b lim: 40 exec/s: 59 rss: 73Mb L: 13/39 MS: 1 CrossOver-
00:07:49.987 [2024-04-24 19:14:36.870744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff02 cdw11:647f316f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.870770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.870835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0900ea08 cdw11:c0000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.870849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.987 #60 NEW cov: 11899 ft: 14845 corp: 32/747b lim: 40 exec/s: 60 rss: 73Mb L: 18/39 MS: 1 CrossOver-
00:07:49.987 [2024-04-24 19:14:36.920809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c7ff0264 cdw11:7ff26f09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.920836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 #61 NEW cov: 11899 ft: 14847 corp: 33/760b lim: 40 exec/s: 61 rss: 73Mb L: 13/39 MS: 1 PersAutoDict- DE: "\377\002"-
00:07:49.987 [2024-04-24 19:14:36.971199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4ef9ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.971226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.971287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.971302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:49.987 [2024-04-24 19:14:36.971359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:49.987 [2024-04-24 19:14:36.971373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:49.987 #62 NEW cov: 11899 ft: 14866 corp: 34/789b lim: 40 exec/s: 62 rss: 73Mb L: 29/39 MS: 1 ChangeBinInt-
00:07:50.246 [2024-04-24 19:14:37.011406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:21c04e00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.011433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.011492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00c50000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.011506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.011563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.011577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.011633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000c77f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.011646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:50.246 #63 NEW cov: 11899 ft: 14906 corp: 35/824b lim: 40 exec/s: 63 rss: 74Mb L: 35/39 MS: 1 CopyPart-
00:07:50.246 [2024-04-24 19:14:37.051505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.051531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.051590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000fe cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.051608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.051664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.051677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.051734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.051748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:50.246 #64 NEW cov: 11899 ft: 14910 corp: 36/858b lim: 40 exec/s: 64 rss: 74Mb L: 34/39 MS: 1 ChangeByte-
00:07:50.246 [2024-04-24 19:14:37.091260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:c0c7ff02 cdw11:647f326f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.091285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 #65 NEW cov: 11899 ft: 14914 corp: 37/871b lim: 40 exec/s: 65 rss: 74Mb L: 13/39 MS: 1 ChangeASCIIInt-
00:07:50.246 [2024-04-24 19:14:37.131741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4e000000 cdw11:00000022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.131768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.131830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.131844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.131902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.131916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.131972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.131985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:50.246 #66 NEW cov: 11899 ft: 14918 corp: 38/905b lim: 40 exec/s: 66 rss: 74Mb L: 34/39 MS: 1 ChangeBinInt-
00:07:50.246 [2024-04-24 19:14:37.171735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.171760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.171838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.171853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:50.246 [2024-04-24 19:14:37.171913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:23232323 cdw11:2300037a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.171929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:50.246 #67 NEW cov: 11899 ft: 14924 corp: 39/930b lim: 40 exec/s: 67 rss: 74Mb L: 25/39 MS: 1 EraseBytes-
00:07:50.246 [2024-04-24 19:14:37.221615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0000000b cdw11:f26f0900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:50.246 [2024-04-24 19:14:37.221640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:50.246 #68 NEW cov: 11899 ft: 14925 corp: 40/941b lim: 40 exec/s: 68 rss: 74Mb L: 11/39 MS: 1 ChangeBinInt-
00:07:50.506 [2024-04-24 19:14:37.262255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK
TRANSPORT 0x0 00:07:50.506 [2024-04-24 19:14:37.262282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.506 [2024-04-24 19:14:37.262342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.506 [2024-04-24 19:14:37.262357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.506 [2024-04-24 19:14:37.262416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.506 [2024-04-24 19:14:37.262430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.506 [2024-04-24 19:14:37.262487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.506 [2024-04-24 19:14:37.262501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.506 [2024-04-24 19:14:37.262560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00037a06 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.506 [2024-04-24 19:14:37.262574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.506 #69 NEW cov: 11899 ft: 14964 corp: 41/981b lim: 40 exec/s: 34 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:07:50.506 #69 DONE cov: 11899 ft: 14964 corp: 41/981b lim: 40 exec/s: 34 rss: 74Mb 00:07:50.506 ###### Recommended dictionary. ###### 00:07:50.506 "\000\000\000\000" # Uses: 1 00:07:50.506 "\300\307d\177\362o\011\000" # Uses: 2 00:07:50.506 "\377\002" # Uses: 1 00:07:50.506 ###### End of recommended dictionary. 
######
00:07:50.506 Done 69 runs in 2 second(s)
00:07:50.506 19:14:37 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz
00:07:50.506 19:14:37 -- ../common.sh@72 -- # (( i++ ))
00:07:50.506 19:14:37 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:50.506 19:14:37 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:07:50.506 19:14:37 -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:07:50.506 19:14:37 -- nvmf/run.sh@24 -- # local timen=1
00:07:50.506 19:14:37 -- nvmf/run.sh@25 -- # local core=0x1
00:07:50.506 19:14:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:07:50.506 19:14:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:07:50.506 19:14:37 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:50.506 19:14:37 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:50.506 19:14:37 -- nvmf/run.sh@34 -- # printf %02d 11
00:07:50.506 19:14:37 -- nvmf/run.sh@34 -- # port=4411
00:07:50.506 19:14:37 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:07:50.506 19:14:37 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:07:50.506 19:14:37 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:50.506 19:14:37 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:50.506 19:14:37 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:50.506 19:14:37 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11
00:07:50.506 [2024-04-24 19:14:37.457335] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
00:07:50.506 [2024-04-24 19:14:37.457411] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623718 ]
00:07:50.506 EAL: No free 2048 kB hugepages reported on node 1
00:07:50.764 [2024-04-24 19:14:37.657290] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:50.764 [2024-04-24 19:14:37.729924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:51.021 [2024-04-24 19:14:37.789680] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:51.022 [2024-04-24 19:14:37.805882] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:07:51.022 INFO: Running with entropic power schedule (0xFF, 100).
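For anyone re-driving this step by hand, the nvmf/run.sh trace above reduces to the sketch below. Paths, flags, and the transport ID string are copied from the trace; the SPDK_ROOT shorthand, the redirect of the sed output into /tmp/fuzz_json_11.conf, and the export of LSAN_OPTIONS are assumptions (the trace shows the commands but not their redirects or variable scope).

#!/usr/bin/env bash
# Minimal sketch of the fuzzer-11 launch traced above, not the literal
# nvmf/run.sh. SPDK_ROOT is an assumed convenience variable.
SPDK_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

# per-fuzzer corpus directory (run.sh@35)
mkdir -p "$SPDK_ROOT/../corpus/llvm_nvmf_11"

# rewrite the config template so the target listens on port 4411 (run.sh@38);
# the redirect into the generated config is assumed, not shown in the trace
sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' \
    "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_11.conf

# suppress the two known, intentional leaks so LeakSanitizer does not fail
# the run (run.sh@41-42); exported here so the child process sees it
{ echo leak:spdk_nvmf_qpair_disconnect; echo leak:nvmf_ctrlr_create; } \
    > /var/tmp/suppress_nvmf_fuzz
export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0

# run the harness on core mask 0x1 with 512 MB of memory (matching the
# EAL parameters "-c 0x1 -m 512" logged above) against the TCP listener
"$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -P "$SPDK_ROOT/../output/llvm/" \
    -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' \
    -c /tmp/fuzz_json_11.conf -t 1 \
    -D "$SPDK_ROOT/../corpus/llvm_nvmf_11" -Z 11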
00:07:51.022 INFO: Seed: 1426672522 00:07:51.022 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:51.022 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:51.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:51.022 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.022 #2 INITED exec/s: 0 rss: 64Mb 00:07:51.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.022 This may also happen if the target rejected all inputs we tried so far 00:07:51.022 [2024-04-24 19:14:37.876957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.022 [2024-04-24 19:14:37.876998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.022 [2024-04-24 19:14:37.877109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fafafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.022 [2024-04-24 19:14:37.877126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 NEW_FUNC[1/671]: 0x4908e0 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:51.280 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.280 #10 NEW cov: 11667 ft: 11662 corp: 2/17b lim: 40 exec/s: 0 rss: 70Mb L: 16/16 MS: 3 ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:51.280 [2024-04-24 19:14:38.217142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-24 19:14:38.217194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 [2024-04-24 19:14:38.217302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-24 19:14:38.217322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.280 #12 NEW cov: 11797 ft: 12158 corp: 3/40b lim: 40 exec/s: 0 rss: 71Mb L: 23/23 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:51.280 [2024-04-24 19:14:38.266873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.280 [2024-04-24 19:14:38.266901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.280 #13 NEW cov: 11803 ft: 12967 corp: 4/52b lim: 40 exec/s: 0 rss: 71Mb L: 12/23 MS: 1 EraseBytes- 00:07:51.542 [2024-04-24 19:14:38.327435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.542 [2024-04-24 19:14:38.327462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:51.542 [2024-04-24 19:14:38.327559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.542 [2024-04-24 19:14:38.327574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.542 #19 NEW cov: 11888 ft: 13288 corp: 5/75b lim: 40 exec/s: 0 rss: 71Mb L: 23/23 MS: 1 ChangeByte- 00:07:51.543 [2024-04-24 19:14:38.387715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.387740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.543 [2024-04-24 19:14:38.387825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fafafaf7 cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.387840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.543 #20 NEW cov: 11888 ft: 13427 corp: 6/91b lim: 40 exec/s: 0 rss: 71Mb L: 16/23 MS: 1 ChangeBinInt- 00:07:51.543 [2024-04-24 19:14:38.437792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.437818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.543 [2024-04-24 19:14:38.437911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:faf7fafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.437927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.543 #26 NEW cov: 11888 ft: 13528 corp: 7/107b lim: 40 exec/s: 0 rss: 72Mb L: 16/23 MS: 1 ShuffleBytes- 00:07:51.543 [2024-04-24 19:14:38.497984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.498010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.543 [2024-04-24 19:14:38.498113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.498129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.543 #27 NEW cov: 11888 ft: 13672 corp: 8/130b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ShuffleBytes- 00:07:51.543 [2024-04-24 19:14:38.558413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.558438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.543 [2024-04-24 19:14:38.558533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:5 nsid:0 cdw10:22545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.543 [2024-04-24 19:14:38.558549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.800 #28 NEW cov: 11888 ft: 13770 corp: 9/153b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ChangeBit- 00:07:51.800 [2024-04-24 19:14:38.608117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fafafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.608146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.800 #29 NEW cov: 11888 ft: 13807 corp: 10/165b lim: 40 exec/s: 0 rss: 72Mb L: 12/23 MS: 1 CopyPart- 00:07:51.800 [2024-04-24 19:14:38.668764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:5454acab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.668789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.800 [2024-04-24 19:14:38.668889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:abababab cdw11:aba25454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.668904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.800 #30 NEW cov: 11888 ft: 13906 corp: 11/188b lim: 40 exec/s: 0 rss: 72Mb L: 23/23 MS: 1 ChangeBinInt- 00:07:51.800 [2024-04-24 19:14:38.718501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.718526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.800 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.800 #31 NEW cov: 11911 ft: 13969 corp: 12/201b lim: 40 exec/s: 0 rss: 72Mb L: 13/23 MS: 1 InsertByte- 00:07:51.800 [2024-04-24 19:14:38.769439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.769464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.800 [2024-04-24 19:14:38.769554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fa545454 cdw11:54542254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.769570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.800 [2024-04-24 19:14:38.769676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:51.800 [2024-04-24 19:14:38.769691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.800 #32 NEW cov: 11911 ft: 14201 corp: 13/232b lim: 40 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 CrossOver- 00:07:52.058 
[2024-04-24 19:14:38.819985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fafaa5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.058 [2024-04-24 19:14:38.820014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.058 [2024-04-24 19:14:38.820125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.058 [2024-04-24 19:14:38.820141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.820236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.820253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.820344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:a5a5fafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.820363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.059 #33 NEW cov: 11911 ft: 14601 corp: 14/268b lim: 40 exec/s: 33 rss: 72Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:52.059 [2024-04-24 19:14:38.879123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:8a41fafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.879153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.059 #35 NEW cov: 11911 ft: 14616 corp: 15/281b lim: 40 exec/s: 35 rss: 72Mb L: 13/36 MS: 2 ChangeBit-CrossOver- 00:07:52.059 [2024-04-24 19:14:38.930119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.930144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.930234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fa545454 cdw11:54542254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.930250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.930340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.930355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.059 #36 NEW cov: 11911 ft: 14639 corp: 16/312b lim: 40 exec/s: 36 rss: 72Mb L: 31/36 MS: 1 ChangeBinInt- 00:07:52.059 [2024-04-24 19:14:38.990215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 
19:14:38.990241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.990328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:54543154 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.990345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:38.990437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:38.990452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.059 #37 NEW cov: 11911 ft: 14672 corp: 17/336b lim: 40 exec/s: 37 rss: 72Mb L: 24/36 MS: 1 InsertByte- 00:07:52.059 [2024-04-24 19:14:39.040032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:39.040063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.059 [2024-04-24 19:14:39.040157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22545454 cdw11:5454acae SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.059 [2024-04-24 19:14:39.040173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.059 #38 NEW cov: 11911 ft: 14702 corp: 18/359b lim: 40 exec/s: 38 rss: 72Mb L: 23/36 MS: 1 ChangeBinInt- 00:07:52.318 [2024-04-24 19:14:39.101027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fafaa5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.101054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.101154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.101170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.101263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.101278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.101370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:a5a5fafa cdw11:54fafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.101385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.318 #39 NEW cov: 11911 ft: 14731 corp: 19/396b lim: 40 exec/s: 39 rss: 72Mb L: 37/37 MS: 1 CrossOver- 00:07:52.318 [2024-04-24 19:14:39.160592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafa9d cdw11:8a775ef4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.160619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.160709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:6f0900fa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.160725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.318 #40 NEW cov: 11911 ft: 14736 corp: 20/412b lim: 40 exec/s: 40 rss: 72Mb L: 16/37 MS: 1 CMP- DE: "\235\212w^\364o\011\000"- 00:07:52.318 [2024-04-24 19:14:39.210808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.210837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.210929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fafafafa cdw11:fafa41fa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.210946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.318 #41 NEW cov: 11911 ft: 14752 corp: 21/428b lim: 40 exec/s: 41 rss: 72Mb L: 16/37 MS: 1 ChangeByte- 00:07:52.318 [2024-04-24 19:14:39.260930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.260957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.261046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.261067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.318 #42 NEW cov: 11911 ft: 14785 corp: 22/451b lim: 40 exec/s: 42 rss: 73Mb L: 23/37 MS: 1 ChangeBinInt- 00:07:52.318 [2024-04-24 19:14:39.311113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.311139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.318 [2024-04-24 19:14:39.311229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:5454549d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.318 [2024-04-24 19:14:39.311248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.318 #43 NEW cov: 11911 ft: 14787 corp: 23/474b lim: 40 exec/s: 43 rss: 73Mb L: 23/37 MS: 1 PersAutoDict- DE: "\235\212w^\364o\011\000"- 00:07:52.577 [2024-04-24 19:14:39.361313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41ffffff cdw11:fffeffff SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.361340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.361429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.361446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.577 #44 NEW cov: 11911 ft: 14814 corp: 24/490b lim: 40 exec/s: 44 rss: 73Mb L: 16/37 MS: 1 CMP- DE: "\377\377\377\377\376\377\377\377"- 00:07:52.577 [2024-04-24 19:14:39.411262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.411289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.577 #45 NEW cov: 11911 ft: 14835 corp: 25/504b lim: 40 exec/s: 45 rss: 73Mb L: 14/37 MS: 1 EraseBytes- 00:07:52.577 [2024-04-24 19:14:39.472136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.472162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.472250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.472265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.472357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.472372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.577 #46 NEW cov: 11911 ft: 14846 corp: 26/535b lim: 40 exec/s: 46 rss: 73Mb L: 31/37 MS: 1 PersAutoDict- DE: "\377\377\377\377\376\377\377\377"- 00:07:52.577 [2024-04-24 19:14:39.522022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41fa5454 cdw11:54fafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.522048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.522147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fafafafa cdw11:f7fafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.522163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.577 #47 NEW cov: 11911 ft: 14865 corp: 27/554b lim: 40 exec/s: 47 rss: 73Mb L: 19/37 MS: 1 CrossOver- 00:07:52.577 [2024-04-24 19:14:39.572818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000001f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.572844] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.572935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fa4a5454 cdw11:54545422 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.572953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.573062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.573076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.577 [2024-04-24 19:14:39.573175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:54fafaf7 cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.577 [2024-04-24 19:14:39.573189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.835 #48 NEW cov: 11911 ft: 14877 corp: 28/586b lim: 40 exec/s: 48 rss: 73Mb L: 32/37 MS: 1 InsertByte- 00:07:52.835 [2024-04-24 19:14:39.632696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:3dffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.632721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.835 [2024-04-24 19:14:39.632809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:feffffff cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.632824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.835 [2024-04-24 19:14:39.632920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.632935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.835 #49 NEW cov: 11911 ft: 14880 corp: 29/617b lim: 40 exec/s: 49 rss: 73Mb L: 31/37 MS: 1 ChangeByte- 00:07:52.835 [2024-04-24 19:14:39.692491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41ffffff cdw11:fffeffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.692516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.835 [2024-04-24 19:14:39.692608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fffafafa cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.692622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.835 #50 NEW cov: 11911 ft: 14889 corp: 30/633b lim: 40 exec/s: 50 rss: 73Mb L: 16/37 MS: 1 ShuffleBytes- 00:07:52.835 [2024-04-24 19:14:39.752768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:4 nsid:0 cdw10:fafafafa cdw11:fafafa41 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.752794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.835 [2024-04-24 19:14:39.752887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:fafafaf7 cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.835 [2024-04-24 19:14:39.752903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.835 #51 NEW cov: 11911 ft: 14909 corp: 31/649b lim: 40 exec/s: 51 rss: 73Mb L: 16/37 MS: 1 ShuffleBytes- 00:07:52.835 [2024-04-24 19:14:39.802930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:7a545454 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.836 [2024-04-24 19:14:39.802956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.836 [2024-04-24 19:14:39.803049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:22545c54 cdw11:54545454 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.836 [2024-04-24 19:14:39.803072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.836 #52 NEW cov: 11911 ft: 14921 corp: 32/672b lim: 40 exec/s: 52 rss: 73Mb L: 23/37 MS: 1 ChangeBit- 00:07:53.094 [2024-04-24 19:14:39.853445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fafaa5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.094 [2024-04-24 19:14:39.853473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.094 [2024-04-24 19:14:39.853570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a541 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.094 [2024-04-24 19:14:39.853586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.094 [2024-04-24 19:14:39.853680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fafafafa cdw11:faa5faa5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.094 [2024-04-24 19:14:39.853696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.095 #53 NEW cov: 11911 ft: 14931 corp: 33/696b lim: 40 exec/s: 26 rss: 73Mb L: 24/37 MS: 1 CrossOver- 00:07:53.095 #53 DONE cov: 11911 ft: 14931 corp: 33/696b lim: 40 exec/s: 26 rss: 73Mb 00:07:53.095 ###### Recommended dictionary. ###### 00:07:53.095 "\235\212w^\364o\011\000" # Uses: 1 00:07:53.095 "\377\377\377\377\376\377\377\377" # Uses: 1 00:07:53.095 ###### End of recommended dictionary. 
###### 00:07:53.095 Done 53 runs in 2 second(s) 00:07:53.095 19:14:39 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:07:53.095 19:14:40 -- ../common.sh@72 -- # (( i++ )) 00:07:53.095 19:14:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.095 19:14:40 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:53.095 19:14:40 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:53.095 19:14:40 -- nvmf/run.sh@24 -- # local timen=1 00:07:53.095 19:14:40 -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.095 19:14:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:53.095 19:14:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:53.095 19:14:40 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:53.095 19:14:40 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:53.095 19:14:40 -- nvmf/run.sh@34 -- # printf %02d 12 00:07:53.095 19:14:40 -- nvmf/run.sh@34 -- # port=4412 00:07:53.095 19:14:40 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:53.095 19:14:40 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:53.095 19:14:40 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.095 19:14:40 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.095 19:14:40 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:53.095 19:14:40 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:07:53.095 [2024-04-24 19:14:40.042368] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:07:53.095 [2024-04-24 19:14:40.042450] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624072 ] 00:07:53.095 EAL: No free 2048 kB hugepages reported on node 1 00:07:53.353 [2024-04-24 19:14:40.344473] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.612 [2024-04-24 19:14:40.442003] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.612 [2024-04-24 19:14:40.501936] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.612 [2024-04-24 19:14:40.518140] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:53.612 INFO: Running with entropic power schedule (0xFF, 100). 
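Each run above ends by printing a "Recommended dictionary" block: octal-escaped byte strings the fuzzer found productive. A minimal sketch of feeding those patterns back into a later run, assuming they are simply written into the corpus directory as seed files; the seed file names are invented here, and the two byte strings are the ones printed in the fuzzer-11 summary above.

corpus=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11

# "\235\212w^\364o\011\000" from the run-11 summary; bash printf interprets
# the same octal escapes the fuzzer printed
printf '\235\212w^\364o\011\000' > "$corpus/seed_dict_1"

# "\377\377\377\377\376\377\377\377" from the same summary
printf '\377\377\377\377\376\377\377\377' > "$corpus/seed_dict_2"

libFuzzer could also take these patterns as an AFL-style -dict= file, but this log does not show whether the llvm_nvme_fuzz wrapper forwards that flag, so seeding the corpus directory it already reads via -D is the safer route.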
00:07:53.612 INFO: Seed: 4139692603 00:07:53.612 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:53.612 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:53.612 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:53.612 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.612 #2 INITED exec/s: 0 rss: 64Mb 00:07:53.612 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:53.612 This may also happen if the target rejected all inputs we tried so far 00:07:53.612 [2024-04-24 19:14:40.585160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.612 [2024-04-24 19:14:40.585207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.612 [2024-04-24 19:14:40.585325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.612 [2024-04-24 19:14:40.585345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 NEW_FUNC[1/671]: 0x492650 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:54.179 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.179 #5 NEW cov: 11665 ft: 11666 corp: 2/21b lim: 40 exec/s: 0 rss: 71Mb L: 20/20 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:54.179 [2024-04-24 19:14:40.914541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.914585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:40.914644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.914660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 #21 NEW cov: 11795 ft: 12102 corp: 3/42b lim: 40 exec/s: 0 rss: 71Mb L: 21/21 MS: 1 CrossOver- 00:07:54.179 [2024-04-24 19:14:40.964887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.964914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:40.964986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.965001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:40.965054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.965073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:40.965127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000008c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:40.965144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.179 #22 NEW cov: 11801 ft: 12705 corp: 4/81b lim: 40 exec/s: 0 rss: 71Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:54.179 [2024-04-24 19:14:41.004634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.004661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.004717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.004731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 #23 NEW cov: 11886 ft: 13053 corp: 5/101b lim: 40 exec/s: 0 rss: 71Mb L: 20/39 MS: 1 EraseBytes- 00:07:54.179 [2024-04-24 19:14:41.054818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c0a cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.054843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.054914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.054929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 #24 NEW cov: 11886 ft: 13175 corp: 6/118b lim: 40 exec/s: 0 rss: 72Mb L: 17/39 MS: 1 EraseBytes- 00:07:54.179 [2024-04-24 19:14:41.094941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.094967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.095023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.095038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 #25 NEW cov: 11886 ft: 13296 corp: 7/138b lim: 40 exec/s: 0 rss: 72Mb L: 20/39 MS: 1 ChangeBit- 00:07:54.179 [2024-04-24 19:14:41.135033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c0a cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 
19:14:41.135063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.135132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.135146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 #26 NEW cov: 11886 ft: 13364 corp: 8/155b lim: 40 exec/s: 0 rss: 72Mb L: 17/39 MS: 1 ShuffleBytes- 00:07:54.179 [2024-04-24 19:14:41.185515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.185539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.185594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.185608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.185677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.185691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.179 [2024-04-24 19:14:41.185745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000008c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.179 [2024-04-24 19:14:41.185759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.438 #27 NEW cov: 11886 ft: 13396 corp: 9/194b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 CopyPart- 00:07:54.438 [2024-04-24 19:14:41.225457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8c8c8c8c cdw11:8c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.225482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.225538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00007e8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.225553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.225607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c8c0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.225621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.438 #28 NEW cov: 11886 ft: 13601 corp: 10/224b lim: 40 exec/s: 0 rss: 72Mb L: 30/39 MS: 1 CopyPart- 00:07:54.438 [2024-04-24 19:14:41.275548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.275573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.275627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.275641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.275697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.275711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.438 #29 NEW cov: 11886 ft: 13649 corp: 11/253b lim: 40 exec/s: 0 rss: 72Mb L: 29/39 MS: 1 EraseBytes- 00:07:54.438 [2024-04-24 19:14:41.315486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.315510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.315566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.315580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.438 #30 NEW cov: 11886 ft: 13684 corp: 12/273b lim: 40 exec/s: 0 rss: 72Mb L: 20/39 MS: 1 CrossOver- 00:07:54.438 [2024-04-24 19:14:41.355786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8cffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.355814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.355870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.355884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.355935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.355948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.438 #31 NEW cov: 11886 ft: 13724 corp: 13/302b lim: 40 exec/s: 0 rss: 72Mb L: 29/39 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:54.438 [2024-04-24 19:14:41.395897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.395922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.395978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.395992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.396048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:848c8c8c cdw11:008c8c00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.396065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.438 #32 NEW cov: 11886 ft: 13743 corp: 14/326b lim: 40 exec/s: 0 rss: 72Mb L: 24/39 MS: 1 CrossOver- 00:07:54.438 [2024-04-24 19:14:41.435836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c0a cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.435860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.438 [2024-04-24 19:14:41.435931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c7e8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.438 [2024-04-24 19:14:41.435946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.698 #33 NEW cov: 11909 ft: 13800 corp: 15/343b lim: 40 exec/s: 0 rss: 72Mb L: 17/39 MS: 1 CrossOver- 00:07:54.698 [2024-04-24 19:14:41.476275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.476300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.476356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.476370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.476423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.476436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.476493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.476507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.698 #34 NEW cov: 11909 ft: 13833 corp: 16/380b lim: 40 exec/s: 0 rss: 72Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:07:54.698 [2024-04-24 19:14:41.516219] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.516244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.516316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.516330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.516385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:008c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.516398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.698 #35 NEW cov: 11909 ft: 13863 corp: 17/410b lim: 40 exec/s: 0 rss: 72Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:07:54.698 [2024-04-24 19:14:41.556192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.556216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.556286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.556301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 #36 NEW cov: 11909 ft: 13934 corp: 18/430b lim: 40 exec/s: 36 rss: 72Mb L: 20/39 MS: 1 ChangeBinInt- 00:07:54.698 [2024-04-24 19:14:41.596174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.596199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 #37 NEW cov: 11909 ft: 14653 corp: 19/439b lim: 40 exec/s: 37 rss: 72Mb L: 9/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:54.698 [2024-04-24 19:14:41.636732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.636757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.636828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.636843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.636896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 
19:14:41.636909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.636962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:8c000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.636978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.698 #38 NEW cov: 11909 ft: 14660 corp: 20/476b lim: 40 exec/s: 38 rss: 72Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:07:54.698 [2024-04-24 19:14:41.676554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8cac8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.676578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.676632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.676646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 #39 NEW cov: 11909 ft: 14679 corp: 21/496b lim: 40 exec/s: 39 rss: 72Mb L: 20/39 MS: 1 ChangeBit- 00:07:54.698 [2024-04-24 19:14:41.706771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.706795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.706851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.706865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.698 [2024-04-24 19:14:41.706935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:7e8c8c0a cdw11:848c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.698 [2024-04-24 19:14:41.706949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #40 NEW cov: 11909 ft: 14710 corp: 22/524b lim: 40 exec/s: 40 rss: 72Mb L: 28/39 MS: 1 CrossOver- 00:07:54.957 [2024-04-24 19:14:41.747072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8cffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.747099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.747170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.747186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.747242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.747256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #41 NEW cov: 11909 ft: 14843 corp: 23/555b lim: 40 exec/s: 41 rss: 73Mb L: 31/39 MS: 1 CMP- DE: "\007\000"- 00:07:54.957 [2024-04-24 19:14:41.787078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.787121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.787178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c7e8c8c cdw11:0a8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.787192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.787248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.787265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #42 NEW cov: 11909 ft: 14846 corp: 24/586b lim: 40 exec/s: 42 rss: 73Mb L: 31/39 MS: 1 CrossOver- 00:07:54.957 [2024-04-24 19:14:41.827201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8c8c8c8c cdw11:8c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.827226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.827282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00007e8c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.827296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.827351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.827364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #43 NEW cov: 11909 ft: 14857 corp: 25/616b lim: 40 exec/s: 43 rss: 73Mb L: 30/39 MS: 1 CrossOver- 00:07:54.957 [2024-04-24 19:14:41.867285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.867309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.867381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c7e8c8c cdw11:0a8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.867395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.867448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.867461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #44 NEW cov: 11909 ft: 14896 corp: 26/647b lim: 40 exec/s: 44 rss: 73Mb L: 31/39 MS: 1 ShuffleBytes- 00:07:54.957 [2024-04-24 19:14:41.917431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8c8c8c8c cdw11:8c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.917456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.917509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00007e8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.917523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.917576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c8c0000 cdw11:27000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.917589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.957 #45 NEW cov: 11909 ft: 14903 corp: 27/678b lim: 40 exec/s: 45 rss: 73Mb L: 31/39 MS: 1 InsertByte- 00:07:54.957 [2024-04-24 19:14:41.957482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c3d8c cdw11:8c8c848c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.957508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.957581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.957595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.957 [2024-04-24 19:14:41.957648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.957 [2024-04-24 19:14:41.957662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.216 #46 NEW cov: 11909 ft: 14917 corp: 28/709b lim: 40 exec/s: 46 rss: 73Mb L: 31/39 MS: 1 InsertByte- 00:07:55.216 [2024-04-24 19:14:42.007673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.007699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.007756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.007770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.007822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.007835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.216 #47 NEW cov: 11909 ft: 14931 corp: 29/738b lim: 40 exec/s: 47 rss: 73Mb L: 29/39 MS: 1 CopyPart- 00:07:55.216 [2024-04-24 19:14:42.047735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.047760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.047816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.047830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.047882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:848c8c8c cdw11:008c8c00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.047895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.216 #48 NEW cov: 11909 ft: 14947 corp: 30/762b lim: 40 exec/s: 48 rss: 73Mb L: 24/39 MS: 1 ShuffleBytes- 00:07:55.216 [2024-04-24 19:14:42.087859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c848c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.087884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.087954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c7e8c8c cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.087969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.088021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.088034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.216 #49 NEW cov: 11909 ft: 15017 corp: 31/793b lim: 40 exec/s: 49 rss: 73Mb L: 31/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:55.216 [2024-04-24 19:14:42.128202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.128228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 
[2024-04-24 19:14:42.128300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:8c8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.128315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.128369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.128382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.128435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0f000000 cdw11:250f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.128448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.216 #50 NEW cov: 11909 ft: 15032 corp: 32/830b lim: 40 exec/s: 50 rss: 73Mb L: 37/39 MS: 1 ChangeBinInt- 00:07:55.216 [2024-04-24 19:14:42.167970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c7473 cdw11:737373f5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.167995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.168070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:736c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.168085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 #51 NEW cov: 11909 ft: 15041 corp: 33/851b lim: 40 exec/s: 51 rss: 73Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:55.216 [2024-04-24 19:14:42.208394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.208419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.208474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8cf6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.216 [2024-04-24 19:14:42.208487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.216 [2024-04-24 19:14:42.208557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f6f6f6f6 cdw11:f6f6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.217 [2024-04-24 19:14:42.208571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.217 [2024-04-24 19:14:42.208625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:f6f6f6f6 cdw11:8c8c8c84 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.217 [2024-04-24 19:14:42.208639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:55.475 #52 NEW cov: 11909 ft: 15050 corp: 34/890b lim: 40 exec/s: 52 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:55.475 [2024-04-24 19:14:42.258065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:8c747eff cdw11:086ff607 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.258092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.475 #55 NEW cov: 11909 ft: 15097 corp: 35/903b lim: 40 exec/s: 55 rss: 73Mb L: 13/39 MS: 3 CrossOver-CopyPart-CMP- DE: "\377\010o\366\007~\274\244"- 00:07:55.475 [2024-04-24 19:14:42.298484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.298509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.298580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.298595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.298649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.298662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.475 #56 NEW cov: 11909 ft: 15110 corp: 36/932b lim: 40 exec/s: 56 rss: 73Mb L: 29/39 MS: 1 ChangeBit- 00:07:55.475 [2024-04-24 19:14:42.338808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.338833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.338901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8cf6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.338916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.338968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f6f6f6f6 cdw11:f6f6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.338982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.339033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:f6f6f60e cdw11:73737384 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.339047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.378921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8cac8c cdw11:8c8c8c8c SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.378945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.379015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000007e cdw11:8cf6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.379030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.379080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f6f6f6f6 cdw11:f6f6f6f6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.379094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.379147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:f6f6f60e cdw11:73737384 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.379164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.475 #58 NEW cov: 11909 ft: 15126 corp: 37/971b lim: 40 exec/s: 58 rss: 73Mb L: 39/39 MS: 2 ChangeBinInt-ChangeBit- 00:07:55.475 [2024-04-24 19:14:42.418981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.419006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.419063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.419077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.419158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:8c0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.419171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.475 [2024-04-24 19:14:42.419224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.475 [2024-04-24 19:14:42.419237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.475 #59 NEW cov: 11909 ft: 15154 corp: 38/1008b lim: 40 exec/s: 59 rss: 73Mb L: 37/39 MS: 1 ShuffleBytes- 00:07:55.476 [2024-04-24 19:14:42.459100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.476 [2024-04-24 19:14:42.459125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.476 [2024-04-24 19:14:42.459193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 
cdw10:8c8c8c8c cdw11:250f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.476 [2024-04-24 19:14:42.459207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.476 [2024-04-24 19:14:42.459262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0f8c8c0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.476 [2024-04-24 19:14:42.459277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.476 [2024-04-24 19:14:42.459330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0f000000 cdw11:250f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.476 [2024-04-24 19:14:42.459343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.476 #60 NEW cov: 11909 ft: 15159 corp: 39/1045b lim: 40 exec/s: 60 rss: 73Mb L: 37/39 MS: 1 CopyPart- 00:07:55.734 [2024-04-24 19:14:42.509279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.509304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-04-24 19:14:42.509360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.509374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 [2024-04-24 19:14:42.509428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.509444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.734 [2024-04-24 19:14:42.509498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000008c cdw11:8ccc8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.509512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.734 #61 NEW cov: 11909 ft: 15169 corp: 40/1084b lim: 40 exec/s: 61 rss: 73Mb L: 39/39 MS: 1 ChangeBit- 00:07:55.734 [2024-04-24 19:14:42.549380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7e8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.549405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-04-24 19:14:42.549462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0a8c8c8c cdw11:8c8c8c8c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-04-24 19:14:42.549476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 [2024-04-24 19:14:42.549530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:6 nsid:0 cdw10:8c8c0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.735 [2024-04-24 19:14:42.549543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.735 [2024-04-24 19:14:42.549598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0f0f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.735 [2024-04-24 19:14:42.549611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.735 #62 NEW cov: 11909 ft: 15179 corp: 41/1122b lim: 40 exec/s: 31 rss: 73Mb L: 38/39 MS: 1 CrossOver- 00:07:55.735 #62 DONE cov: 11909 ft: 15179 corp: 41/1122b lim: 40 exec/s: 31 rss: 73Mb 00:07:55.735 ###### Recommended dictionary. ###### 00:07:55.735 "\377\377\377\377\377\377\377\377" # Uses: 2 00:07:55.735 "\007\000" # Uses: 0 00:07:55.735 "\377\010o\366\007~\274\244" # Uses: 0 00:07:55.735 ###### End of recommended dictionary. ###### 00:07:55.735 Done 62 runs in 2 second(s) 00:07:55.735 19:14:42 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:07:55.735 19:14:42 -- ../common.sh@72 -- # (( i++ )) 00:07:55.735 19:14:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.735 19:14:42 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:55.735 19:14:42 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:55.735 19:14:42 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.735 19:14:42 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.735 19:14:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:55.735 19:14:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:55.735 19:14:42 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.735 19:14:42 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.735 19:14:42 -- nvmf/run.sh@34 -- # printf %02d 13 00:07:55.735 19:14:42 -- nvmf/run.sh@34 -- # port=4413 00:07:55.735 19:14:42 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:55.735 19:14:42 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:55.735 19:14:42 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.735 19:14:42 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.735 19:14:42 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.735 19:14:42 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:07:55.993 [2024-04-24 19:14:42.759688] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
00:07:55.993 [2024-04-24 19:14:42.759781] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624431 ] 00:07:55.993 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.252 [2024-04-24 19:14:43.077496] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.252 [2024-04-24 19:14:43.162661] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.252 [2024-04-24 19:14:43.222544] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.252 [2024-04-24 19:14:43.238742] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:56.252 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.252 INFO: Seed: 2566717346 00:07:56.511 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:56.511 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:56.511 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:56.511 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.511 #2 INITED exec/s: 0 rss: 64Mb 00:07:56.511 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:56.511 This may also happen if the target rejected all inputs we tried so far 00:07:56.511 [2024-04-24 19:14:43.284031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:5ca54700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.511 [2024-04-24 19:14:43.284067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.769 NEW_FUNC[1/667]: 0x494210 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:56.769 NEW_FUNC[2/667]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:56.769 #6 NEW cov: 11608 ft: 11646 corp: 2/11b lim: 40 exec/s: 0 rss: 71Mb L: 10/10 MS: 4 ChangeByte-InsertByte-ChangeByte-CMP- DE: "G\000\000\000\000\000\000\000"- 00:07:56.769 [2024-04-24 19:14:43.614994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.769 [2024-04-24 19:14:43.615039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.769 [2024-04-24 19:14:43.615118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.769 [2024-04-24 19:14:43.615134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.770 NEW_FUNC[1/3]: 0x19bc420 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:07:56.770 NEW_FUNC[2/3]: 0x19bdb70 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:07:56.770 #8 NEW cov: 11783 ft: 12515 corp: 3/30b lim: 40 exec/s: 0 rss: 71Mb L: 19/19 MS: 2 
ShuffleBytes-InsertRepeatedBytes- 00:07:56.770 [2024-04-24 19:14:43.655016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa2c cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.655046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.770 [2024-04-24 19:14:43.655109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.655127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.770 #9 NEW cov: 11789 ft: 12707 corp: 4/49b lim: 40 exec/s: 0 rss: 71Mb L: 19/19 MS: 1 ChangeByte- 00:07:56.770 [2024-04-24 19:14:43.695145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.695171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.770 [2024-04-24 19:14:43.695248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ba4fbaba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.695263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.770 #10 NEW cov: 11874 ft: 12950 corp: 5/68b lim: 40 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ChangeByte- 00:07:56.770 [2024-04-24 19:14:43.735307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.735333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.770 [2024-04-24 19:14:43.735390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.735404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.770 #11 NEW cov: 11874 ft: 12991 corp: 6/87b lim: 40 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:56.770 [2024-04-24 19:14:43.775262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.770 [2024-04-24 19:14:43.775288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.028 #13 NEW cov: 11874 ft: 13063 corp: 7/98b lim: 40 exec/s: 0 rss: 72Mb L: 11/19 MS: 2 CrossOver-CrossOver- 00:07:57.028 [2024-04-24 19:14:43.815465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.028 [2024-04-24 19:14:43.815490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.028 [2024-04-24 19:14:43.815562] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0003ffba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.815577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.029 #14 NEW cov: 11874 ft: 13119 corp: 8/117b lim: 40 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 CMP- DE: "\000\000\000\000\000\000\003\377"- 00:07:57.029 [2024-04-24 19:14:43.865524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2c000200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.865549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.029 #17 NEW cov: 11874 ft: 13150 corp: 9/128b lim: 40 exec/s: 0 rss: 72Mb L: 11/19 MS: 3 ChangeByte-CMP-CMP- DE: "\000\003"-"\002\000\000\000\000\000\000\000"- 00:07:57.029 [2024-04-24 19:14:43.905755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.905780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.905859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:9abababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.905876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.029 #18 NEW cov: 11874 ft: 13161 corp: 10/147b lim: 40 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 ChangeBit- 00:07:57.029 [2024-04-24 19:14:43.946067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa2c cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.946092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.946167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:baba2cba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.946181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.946240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.946254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.946313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.946327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.029 #19 NEW cov: 11874 ft: 13720 corp: 11/179b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:07:57.029 [2024-04-24 19:14:43.986210] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.986235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.986312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.986326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.986387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababa9a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.986401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.029 [2024-04-24 19:14:43.986456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:bababa9a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:43.986469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.029 #20 NEW cov: 11874 ft: 13749 corp: 12/213b lim: 40 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CopyPart- 00:07:57.029 [2024-04-24 19:14:44.035974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:5ca50200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.029 [2024-04-24 19:14:44.035999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 #21 NEW cov: 11874 ft: 13849 corp: 13/223b lim: 40 exec/s: 0 rss: 72Mb L: 10/34 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:57.288 [2024-04-24 19:14:44.076454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.076483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.076558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.076573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.076627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababaff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.076640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.076699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:9abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.076713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.288 #22 NEW cov: 11874 ft: 13905 corp: 14/262b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:57.288 [2024-04-24 19:14:44.126480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.126506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.126567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.126583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.126640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.126654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.288 #23 NEW cov: 11874 ft: 14089 corp: 15/292b lim: 40 exec/s: 0 rss: 72Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:07:57.288 [2024-04-24 19:14:44.166635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.166662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.166723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.166737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.166794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.166807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.288 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.288 #24 NEW cov: 11897 ft: 14150 corp: 16/322b lim: 40 exec/s: 0 rss: 72Mb L: 30/39 MS: 1 CrossOver- 00:07:57.288 [2024-04-24 19:14:44.216742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.216768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.216830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:bababa0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.216844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:57.288 [2024-04-24 19:14:44.216904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:000000ba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.216918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.288 #25 NEW cov: 11897 ft: 14155 corp: 17/348b lim: 40 exec/s: 0 rss: 72Mb L: 26/39 MS: 1 CrossOver- 00:07:57.288 [2024-04-24 19:14:44.256854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aba0aba cdw11:ba2cbaba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.288 [2024-04-24 19:14:44.256880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.288 [2024-04-24 19:14:44.256956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2cbababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.289 [2024-04-24 19:14:44.256971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.289 [2024-04-24 19:14:44.257029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.289 [2024-04-24 19:14:44.257043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.289 #26 NEW cov: 11897 ft: 14170 corp: 18/372b lim: 40 exec/s: 26 rss: 72Mb L: 24/39 MS: 1 CopyPart- 00:07:57.289 [2024-04-24 19:14:44.296714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.289 [2024-04-24 19:14:44.296740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.547 #27 NEW cov: 11897 ft: 14193 corp: 19/383b lim: 40 exec/s: 27 rss: 72Mb L: 11/39 MS: 1 ChangeByte- 00:07:57.548 [2024-04-24 19:14:44.337079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aba0aba cdw11:ba2cbaba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.337105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.337180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2cbababa cdw11:babaaaba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.337195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.337252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.337265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.548 #28 NEW cov: 11897 ft: 14200 corp: 20/407b lim: 40 exec/s: 28 rss: 72Mb L: 24/39 MS: 1 ChangeBit- 00:07:57.548 [2024-04-24 19:14:44.377074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa2c cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.377100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.377157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:25bababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.377173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.548 #29 NEW cov: 11897 ft: 14215 corp: 21/427b lim: 40 exec/s: 29 rss: 72Mb L: 20/39 MS: 1 InsertByte- 00:07:57.548 [2024-04-24 19:14:44.417446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.417472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.417547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.417563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.417621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababaff cdw11:fffffeff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.417634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.417691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:9abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.417705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.548 #30 NEW cov: 11897 ft: 14246 corp: 22/466b lim: 40 exec/s: 30 rss: 73Mb L: 39/39 MS: 1 ChangeBit- 00:07:57.548 [2024-04-24 19:14:44.467196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.467222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 #31 NEW cov: 11897 ft: 14330 corp: 23/477b lim: 40 exec/s: 31 rss: 73Mb L: 11/39 MS: 1 ShuffleBytes- 00:07:57.548 [2024-04-24 19:14:44.507533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.507557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.507635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.507649] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.507705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:2cbababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.507718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.548 #32 NEW cov: 11897 ft: 14345 corp: 24/507b lim: 40 exec/s: 32 rss: 73Mb L: 30/39 MS: 1 ChangeByte- 00:07:57.548 [2024-04-24 19:14:44.547644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:860aba0a cdw11:baba2cba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.547670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.547730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ba2cbaba cdw11:bababaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.547745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.548 [2024-04-24 19:14:44.547805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.548 [2024-04-24 19:14:44.547819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.808 #33 NEW cov: 11897 ft: 14370 corp: 25/532b lim: 40 exec/s: 33 rss: 73Mb L: 25/39 MS: 1 InsertByte- 00:07:57.808 [2024-04-24 19:14:44.587634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:5ca50200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.587660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.587720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00004700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.587734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.808 #34 NEW cov: 11897 ft: 14377 corp: 26/550b lim: 40 exec/s: 34 rss: 73Mb L: 18/39 MS: 1 PersAutoDict- DE: "G\000\000\000\000\000\000\000"- 00:07:57.808 [2024-04-24 19:14:44.637703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.637728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 #35 NEW cov: 11897 ft: 14403 corp: 27/561b lim: 40 exec/s: 35 rss: 73Mb L: 11/39 MS: 1 CopyPart- 00:07:57.808 [2024-04-24 19:14:44.677810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babac3ba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.677836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 #36 NEW cov: 11897 ft: 14412 corp: 28/572b lim: 40 exec/s: 36 rss: 73Mb L: 11/39 MS: 1 ChangeBinInt- 00:07:57.808 [2024-04-24 19:14:44.708184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.708210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.708285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.708300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.708358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababab3 cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.708372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.808 #37 NEW cov: 11897 ft: 14426 corp: 29/602b lim: 40 exec/s: 37 rss: 73Mb L: 30/39 MS: 1 ChangeBinInt- 00:07:57.808 [2024-04-24 19:14:44.747987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa2b cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.748013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 #38 NEW cov: 11897 ft: 14491 corp: 30/613b lim: 40 exec/s: 38 rss: 73Mb L: 11/39 MS: 1 ChangeByte- 00:07:57.808 [2024-04-24 19:14:44.788479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:47000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.788505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.788584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.788599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.788659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababaff cdw11:fffffeff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.788672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.808 [2024-04-24 19:14:44.788732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:9abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.808 [2024-04-24 19:14:44.788746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.808 #39 NEW cov: 11897 ft: 14498 corp: 31/652b lim: 40 exec/s: 39 rss: 73Mb L: 39/39 MS: 1 PersAutoDict- DE: 
"G\000\000\000\000\000\000\000"- 00:07:58.067 [2024-04-24 19:14:44.838389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa00 cdw11:373a6268 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.838415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.838478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f76f0900 cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.838492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 #40 NEW cov: 11897 ft: 14545 corp: 32/671b lim: 40 exec/s: 40 rss: 73Mb L: 19/39 MS: 1 CMP- DE: "7:bh\367o\011\000"- 00:07:58.067 [2024-04-24 19:14:44.878728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.878754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.878811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ba0000ba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.878825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.878883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babaffff cdw11:fffeff9a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.878897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.878955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.878969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.067 #41 NEW cov: 11897 ft: 14549 corp: 33/708b lim: 40 exec/s: 41 rss: 73Mb L: 37/39 MS: 1 CrossOver- 00:07:58.067 [2024-04-24 19:14:44.918619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ababa00 cdw11:373a6268 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.918644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.918721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f76f0900 cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.918738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 #42 NEW cov: 11897 ft: 14552 corp: 34/728b lim: 40 exec/s: 42 rss: 73Mb L: 20/39 MS: 1 InsertByte- 00:07:58.067 [2024-04-24 19:14:44.958845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 
cdw10:0aba0aba cdw11:ba2cbaba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.958870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.958945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2c0ababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.958959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.959021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:baaababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.959034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.067 #43 NEW cov: 11897 ft: 14555 corp: 35/755b lim: 40 exec/s: 43 rss: 73Mb L: 27/39 MS: 1 CrossOver- 00:07:58.067 [2024-04-24 19:14:44.998900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0ac30aba cdw11:ba2cbaba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.998925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.998984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2cbababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.998997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:44.999056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:44.999074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.067 #44 NEW cov: 11897 ft: 14571 corp: 36/779b lim: 40 exec/s: 44 rss: 73Mb L: 24/39 MS: 1 ChangeBinInt- 00:07:58.067 [2024-04-24 19:14:45.039229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:45.039253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:45.039328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:45.039343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:45.039401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:babababa cdw11:bab2813d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:45.039414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.067 [2024-04-24 19:14:45.039470] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a4b37f00 cdw11:00bababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.067 [2024-04-24 19:14:45.039484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.067 #45 NEW cov: 11897 ft: 14602 corp: 37/817b lim: 40 exec/s: 45 rss: 73Mb L: 38/39 MS: 1 CMP- DE: "\262\201=\244\263\177\000\000"- 00:07:58.067 [2024-04-24 19:14:45.079193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2c000000 cdw11:00ababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.068 [2024-04-24 19:14:45.079219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.068 [2024-04-24 19:14:45.079281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:abababab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.068 [2024-04-24 19:14:45.079295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.068 [2024-04-24 19:14:45.079354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:abababab cdw11:ababab00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.068 [2024-04-24 19:14:45.079367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.326 #46 NEW cov: 11897 ft: 14651 corp: 38/846b lim: 40 exec/s: 46 rss: 73Mb L: 29/39 MS: 1 InsertRepeatedBytes- 00:07:58.326 [2024-04-24 19:14:45.119430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.326 [2024-04-24 19:14:45.119455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.326 [2024-04-24 19:14:45.119532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.326 [2024-04-24 19:14:45.119547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.326 [2024-04-24 19:14:45.119608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:bababa0c cdw11:000000ba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.119622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.119682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:bababa9a cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.119696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.327 #47 NEW cov: 11897 ft: 14662 corp: 39/880b lim: 40 exec/s: 47 rss: 73Mb L: 34/39 MS: 1 CMP- DE: "\014\000\000\000"- 00:07:58.327 [2024-04-24 19:14:45.159138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aba0c00 cdw11:0000baba SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.159165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.327 #48 NEW cov: 11897 ft: 14664 corp: 40/891b lim: 40 exec/s: 48 rss: 73Mb L: 11/39 MS: 1 PersAutoDict- DE: "\014\000\000\000"- 00:07:58.327 [2024-04-24 19:14:45.199291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:5ca50200 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.199317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.327 #49 NEW cov: 11897 ft: 14668 corp: 41/901b lim: 40 exec/s: 49 rss: 73Mb L: 10/39 MS: 1 ShuffleBytes- 00:07:58.327 [2024-04-24 19:14:45.239668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.239694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.239770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:bababab2 cdw11:813da4b3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.239787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.239847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7f0000ba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.239860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.327 #50 NEW cov: 11897 ft: 14670 corp: 42/929b lim: 40 exec/s: 50 rss: 74Mb L: 28/39 MS: 1 EraseBytes- 00:07:58.327 [2024-04-24 19:14:45.279877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0abababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.279904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.279967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:babababa cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.279982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.280044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:baba0c00 cdw11:0000baba SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.280057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.327 [2024-04-24 19:14:45.280142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:baba9aba cdw11:babababa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.327 [2024-04-24 19:14:45.280157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0
00:07:58.327 #51 NEW cov: 11897 ft: 14693 corp: 43/962b lim: 40 exec/s: 25 rss: 74Mb L: 33/39 MS: 1 EraseBytes-
00:07:58.327 #51 DONE cov: 11897 ft: 14693 corp: 43/962b lim: 40 exec/s: 25 rss: 74Mb
00:07:58.327 ###### Recommended dictionary. ######
00:07:58.327 "G\000\000\000\000\000\000\000" # Uses: 2
00:07:58.327 "\000\000\000\000\000\000\003\377" # Uses: 0
00:07:58.327 "\000\003" # Uses: 0
00:07:58.327 "\002\000\000\000\000\000\000\000" # Uses: 1
00:07:58.327 "7:bh\367o\011\000" # Uses: 0
00:07:58.327 "\262\201=\244\263\177\000\000" # Uses: 0
00:07:58.327 "\014\000\000\000" # Uses: 1
00:07:58.327 ###### End of recommended dictionary. ######
00:07:58.327 Done 51 runs in 2 second(s)
00:07:58.585 19:14:45 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz
00:07:58.585 19:14:45 -- ../common.sh@72 -- # (( i++ ))
00:07:58.585 19:14:45 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:58.585 19:14:45 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1
00:07:58.585 19:14:45 -- nvmf/run.sh@23 -- # local fuzzer_type=14
00:07:58.585 19:14:45 -- nvmf/run.sh@24 -- # local timen=1
00:07:58.585 19:14:45 -- nvmf/run.sh@25 -- # local core=0x1
00:07:58.585 19:14:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:58.585 19:14:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf
00:07:58.585 19:14:45 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:07:58.585 19:14:45 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:07:58.585 19:14:45 -- nvmf/run.sh@34 -- # printf %02d 14
00:07:58.585 19:14:45 -- nvmf/run.sh@34 -- # port=4414
00:07:58.585 19:14:45 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14
00:07:58.585 19:14:45 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414'
00:07:58.585 19:14:45 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:58.585 19:14:45 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:07:58.585 19:14:45 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:07:58.585 19:14:45 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14
00:07:58.586 [2024-04-24 19:14:45.494531] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
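Note: the traced shell commands above show how nvmf/run.sh launches fuzzer 14: it derives port 4414 from the fuzzer index ("44" plus the zero-padded output of printf %02d 14), rewrites trsvcid 4420 to 4414 in fuzz_json.conf, registers two LeakSanitizer leak suppressions, and then runs llvm_nvme_fuzz time-bounded (-t 1) against the TCP target described by the -F target ID. Below is a minimal standalone sketch of the same launch, reconstructed only from the trace; $SPDK (a placeholder for the repo root) and the two output redirections are assumptions the trace does not show directly:

  trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414'
  # Rewrite the listener port in the target config (redirection inferred, not in the trace).
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' \
    "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_14.conf
  # Suppress the two known-benign leaks the script registers (file write inferred).
  printf 'leak:spdk_nvmf_qpair_disconnect\nleak:nvmf_ctrlr_create\n' > /var/tmp/suppress_nvmf_fuzz
  export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
  # -Z 14 selects this fuzzer, -F points it at the freshly started NVMe/TCP target.
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
    -F "$trid" -c /tmp/fuzz_json_14.conf -t 1 -D "$SPDK/../corpus/llvm_nvmf_14" -Z 14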
00:07:58.586 [2024-04-24 19:14:45.494608] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624793 ] 00:07:58.586 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.844 [2024-04-24 19:14:45.809933] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.102 [2024-04-24 19:14:45.895821] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.102 [2024-04-24 19:14:45.955368] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.102 [2024-04-24 19:14:45.971578] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:59.102 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.102 INFO: Seed: 1003742721 00:07:59.102 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:07:59.102 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:07:59.102 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:59.102 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.102 #2 INITED exec/s: 0 rss: 64Mb 00:07:59.102 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.102 This may also happen if the target rejected all inputs we tried so far 00:07:59.102 [2024-04-24 19:14:46.050001] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.102 [2024-04-24 19:14:46.050056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.102 [2024-04-24 19:14:46.050212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.102 [2024-04-24 19:14:46.050240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.102 [2024-04-24 19:14:46.050390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.102 [2024-04-24 19:14:46.050423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.102 [2024-04-24 19:14:46.050573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.102 [2024-04-24 19:14:46.050607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.360 NEW_FUNC[1/671]: 0x495dd0 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:59.360 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.360 #4 NEW cov: 11647 ft: 11648 corp: 2/32b lim: 35 exec/s: 0 rss: 71Mb L: 31/31 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:59.619 [2024-04-24 19:14:46.390492] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:59.619 [2024-04-24 19:14:46.390546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.390665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.390697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.390804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.390828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.390932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.390955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.619 #5 NEW cov: 11777 ft: 12209 corp: 3/63b lim: 35 exec/s: 0 rss: 71Mb L: 31/31 MS: 1 ChangeBinInt- 00:07:59.619 [2024-04-24 19:14:46.450690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.450723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.450835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.450852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.450951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.450969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.451064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.451082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.619 #6 NEW cov: 11783 ft: 12485 corp: 4/95b lim: 35 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 CrossOver- 00:07:59.619 [2024-04-24 19:14:46.510258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.510288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.510392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.510410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.619 #7 NEW cov: 11868 ft: 12990 corp: 5/109b lim: 35 exec/s: 0 rss: 72Mb L: 14/32 MS: 1 CrossOver- 00:07:59.619 [2024-04-24 19:14:46.570151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000bf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.570180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.619 #11 NEW cov: 11868 ft: 13791 corp: 6/120b lim: 35 exec/s: 0 rss: 72Mb L: 11/32 MS: 4 CrossOver-ChangeBit-InsertByte-CrossOver- 00:07:59.619 [2024-04-24 19:14:46.621123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.621152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.621250] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.621267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.619 [2024-04-24 19:14:46.621367] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.619 [2024-04-24 19:14:46.621382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.878 #12 NEW cov: 11868 ft: 14011 corp: 7/145b lim: 35 exec/s: 0 rss: 72Mb L: 25/32 MS: 1 EraseBytes- 00:07:59.878 [2024-04-24 19:14:46.671487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.671516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.671619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.671637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.671738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.671754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.878 #13 NEW cov: 11868 ft: 14124 corp: 8/167b lim: 35 exec/s: 0 rss: 72Mb L: 22/32 MS: 1 EraseBytes- 00:07:59.878 [2024-04-24 19:14:46.722125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.722155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.722250] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 
[2024-04-24 19:14:46.722268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.722375] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.722392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.722481] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.722499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.878 #14 NEW cov: 11868 ft: 14147 corp: 9/201b lim: 35 exec/s: 0 rss: 72Mb L: 34/34 MS: 1 CMP- DE: "\001\000"- 00:07:59.878 [2024-04-24 19:14:46.771592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.771622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.771720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.771739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.878 #15 NEW cov: 11868 ft: 14213 corp: 10/217b lim: 35 exec/s: 0 rss: 72Mb L: 16/34 MS: 1 CMP- DE: "\377\377"- 00:07:59.878 [2024-04-24 19:14:46.832779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.832810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.832921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.832942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.833041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.833063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.878 [2024-04-24 19:14:46.833160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 [2024-04-24 19:14:46.833182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.878 #16 NEW cov: 11868 ft: 14279 corp: 11/248b lim: 35 exec/s: 0 rss: 72Mb L: 31/34 MS: 1 ChangeByte- 00:07:59.878 [2024-04-24 19:14:46.881779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000bf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.878 
[2024-04-24 19:14:46.881806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.137 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.137 #17 NEW cov: 11891 ft: 14324 corp: 12/259b lim: 35 exec/s: 0 rss: 72Mb L: 11/34 MS: 1 CrossOver- 00:08:00.137 [2024-04-24 19:14:46.942172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000bf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:46.942201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.137 #20 NEW cov: 11891 ft: 14355 corp: 13/267b lim: 35 exec/s: 0 rss: 72Mb L: 8/34 MS: 3 CrossOver-InsertByte-CrossOver- 00:08:00.137 [2024-04-24 19:14:46.993308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:46.993340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.137 [2024-04-24 19:14:46.993458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:46.993476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.137 [2024-04-24 19:14:46.993566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:46.993583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.137 #21 NEW cov: 11891 ft: 14371 corp: 14/294b lim: 35 exec/s: 21 rss: 72Mb L: 27/34 MS: 1 EraseBytes- 00:08:00.137 [2024-04-24 19:14:47.052962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000bf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:47.052992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.137 #22 NEW cov: 11891 ft: 14472 corp: 15/302b lim: 35 exec/s: 22 rss: 72Mb L: 8/34 MS: 1 ChangeBit- 00:08:00.137 [2024-04-24 19:14:47.114065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:47.114097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.137 [2024-04-24 19:14:47.114207] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:47.114230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.137 [2024-04-24 19:14:47.114325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.137 [2024-04-24 19:14:47.114347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.137 #23 NEW cov: 11891 ft: 14489 corp: 16/329b lim: 35 exec/s: 23 rss: 72Mb L: 27/34 MS: 1 CopyPart- 00:08:00.395 [2024-04-24 19:14:47.174041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.174075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.174190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.174209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.395 #24 NEW cov: 11891 ft: 14499 corp: 17/343b lim: 35 exec/s: 24 rss: 72Mb L: 14/34 MS: 1 ShuffleBytes- 00:08:00.395 [2024-04-24 19:14:47.224917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.224947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.225043] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.225065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.225160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.225179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.395 #25 NEW cov: 11891 ft: 14513 corp: 18/367b lim: 35 exec/s: 25 rss: 73Mb L: 24/34 MS: 1 InsertRepeatedBytes- 00:08:00.395 [2024-04-24 19:14:47.285084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.285114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.285220] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.285240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.285335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.285352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.395 #26 NEW cov: 11891 ft: 14548 corp: 19/393b lim: 35 exec/s: 26 rss: 73Mb L: 26/34 MS: 1 InsertRepeatedBytes- 00:08:00.395 [2024-04-24 19:14:47.335521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 
19:14:47.335554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.335643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.335667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.335762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.335780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.395 #27 NEW cov: 11891 ft: 14557 corp: 20/415b lim: 35 exec/s: 27 rss: 73Mb L: 22/34 MS: 1 ChangeByte- 00:08:00.395 [2024-04-24 19:14:47.395452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.395482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.395 [2024-04-24 19:14:47.395590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.395 [2024-04-24 19:14:47.395611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.653 #33 NEW cov: 11891 ft: 14616 corp: 21/429b lim: 35 exec/s: 33 rss: 73Mb L: 14/34 MS: 1 ChangeBinInt- 00:08:00.653 [2024-04-24 19:14:47.446021] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.446052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.446167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.446188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.446298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.446316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.653 #34 NEW cov: 11891 ft: 14635 corp: 22/456b lim: 35 exec/s: 34 rss: 73Mb L: 27/34 MS: 1 ChangeBinInt- 00:08:00.653 [2024-04-24 19:14:47.505634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.505661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.653 #35 NEW cov: 11898 ft: 14667 corp: 23/464b lim: 35 exec/s: 35 rss: 73Mb L: 8/34 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:00.653 [2024-04-24 19:14:47.556959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.556989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.557083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.557115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.557214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.557232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.557322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.557340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.653 #36 NEW cov: 11898 ft: 14712 corp: 24/496b lim: 35 exec/s: 36 rss: 73Mb L: 32/34 MS: 1 ChangeByte- 00:08:00.653 [2024-04-24 19:14:47.607447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.607476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.607571] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.607589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.607688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.607704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.607792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.607812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.653 #37 NEW cov: 11898 ft: 14715 corp: 25/527b lim: 35 exec/s: 37 rss: 73Mb L: 31/34 MS: 1 CrossOver- 00:08:00.653 [2024-04-24 19:14:47.657350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.657380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.657487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.657505] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.653 [2024-04-24 19:14:47.657598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.653 [2024-04-24 19:14:47.657615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.912 #38 NEW cov: 11898 ft: 14725 corp: 26/548b lim: 35 exec/s: 38 rss: 73Mb L: 21/34 MS: 1 EraseBytes- 00:08:00.912 [2024-04-24 19:14:47.717503] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.717534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.912 [2024-04-24 19:14:47.717624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.717644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.912 #39 NEW cov: 11898 ft: 14732 corp: 27/566b lim: 35 exec/s: 39 rss: 73Mb L: 18/34 MS: 1 EraseBytes- 00:08:00.912 [2024-04-24 19:14:47.767325] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000bf SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.767354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.912 #45 NEW cov: 11898 ft: 14747 corp: 28/574b lim: 35 exec/s: 45 rss: 73Mb L: 8/34 MS: 1 ChangeBinInt- 00:08:00.912 [2024-04-24 19:14:47.828861] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.828890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.912 [2024-04-24 19:14:47.828978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.828997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.912 [2024-04-24 19:14:47.829102] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.829120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.912 [2024-04-24 19:14:47.829204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000f7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.829223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.912 #46 NEW cov: 11898 ft: 14768 corp: 29/608b lim: 35 exec/s: 46 rss: 73Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:00.912 [2024-04-24 19:14:47.878436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 
cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.878470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.912 [2024-04-24 19:14:47.878573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.912 [2024-04-24 19:14:47.878593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.912 #47 NEW cov: 11898 ft: 14819 corp: 30/622b lim: 35 exec/s: 47 rss: 73Mb L: 14/34 MS: 1 ShuffleBytes- 00:08:01.171 [2024-04-24 19:14:47.938912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.938941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.171 [2024-04-24 19:14:47.939049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.939074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.171 #48 NEW cov: 11898 ft: 14864 corp: 31/636b lim: 35 exec/s: 48 rss: 73Mb L: 14/34 MS: 1 ChangeBinInt- 00:08:01.171 [2024-04-24 19:14:47.989866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.989896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.171 [2024-04-24 19:14:47.990008] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.990026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.171 [2024-04-24 19:14:47.990118] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.990136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.171 [2024-04-24 19:14:47.990239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.171 [2024-04-24 19:14:47.990256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.171 #49 NEW cov: 11898 ft: 14886 corp: 32/664b lim: 35 exec/s: 24 rss: 73Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:01.171 #49 DONE cov: 11898 ft: 14886 corp: 32/664b lim: 35 exec/s: 24 rss: 73Mb 00:08:01.171 ###### Recommended dictionary. ###### 00:08:01.171 "\001\000" # Uses: 0 00:08:01.171 "\377\377" # Uses: 0 00:08:01.171 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:01.171 ###### End of recommended dictionary. 
###### 00:08:01.171 Done 49 runs in 2 second(s) 00:08:01.171 19:14:48 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:01.171 19:14:48 -- ../common.sh@72 -- # (( i++ )) 00:08:01.171 19:14:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:01.171 19:14:48 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:01.171 19:14:48 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:01.171 19:14:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:01.171 19:14:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:01.171 19:14:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:01.171 19:14:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:01.171 19:14:48 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:01.171 19:14:48 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:01.171 19:14:48 -- nvmf/run.sh@34 -- # printf %02d 15 00:08:01.171 19:14:48 -- nvmf/run.sh@34 -- # port=4415 00:08:01.171 19:14:48 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:01.171 19:14:48 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:01.171 19:14:48 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:01.171 19:14:48 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:01.171 19:14:48 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:01.171 19:14:48 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:01.430 [2024-04-24 19:14:48.194501] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:01.430 [2024-04-24 19:14:48.194582] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625150 ] 00:08:01.430 EAL: No free 2048 kB hugepages reported on node 1 00:08:01.689 [2024-04-24 19:14:48.502915] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.689 [2024-04-24 19:14:48.589545] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.689 [2024-04-24 19:14:48.648884] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.689 [2024-04-24 19:14:48.665093] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:01.689 INFO: Running with entropic power schedule (0xFF, 100). 
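[editorial note] For readers following the nvmf/run.sh trace above: each fuzzer instance is launched by the same start_llvm_fuzz pattern — derive a per-run TCP port from the fuzzer id, rewrite the JSON config's trsvcid, write LSAN leak suppressions, then invoke llvm_nvme_fuzz. Below is a minimal sketch reconstructed only from the commands echoed in this log; SPDK_DIR/OUT_DIR stand in for the Jenkins workspace paths, and the output redirections for sed and the suppression file are assumptions (the trace shows the commands but not their redirections).

#!/usr/bin/env bash
# Sketch of the per-run setup visible in the nvmf/run.sh trace above.
set -e

SPDK_DIR=/path/to/spdk          # workspace path in the log (assumption)
OUT_DIR="$SPDK_DIR/../output"   # -P target in the log (assumption)

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
    local nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
    local suppress_file=/var/tmp/suppress_nvmf_fuzz

    # Each run listens on its own port: "44" + zero-padded fuzzer id,
    # matching the trace (printf %02d 15 -> port=4415).
    local port="44$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus_dir"

    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Rewrite the default trsvcid 4420 to this run's port, as in the trace.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Known shutdown leaks are suppressed rather than failing the run.
    echo leak:spdk_nvmf_qpair_disconnect >  "$suppress_file"
    echo leak:nvmf_ctrlr_create          >> "$suppress_file"

    LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
        -m "$core" -s 512 -P "$OUT_DIR/llvm/" -F "$trid" \
        -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
}

start_llvm_fuzz 15 1 0x1    # the invocation traced above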
00:08:01.689 INFO: Seed: 3697759645 00:08:01.689 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:01.689 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:01.689 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:01.689 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.689 #2 INITED exec/s: 0 rss: 64Mb 00:08:01.689 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.689 This may also happen if the target rejected all inputs we tried so far 00:08:01.948 [2024-04-24 19:14:48.710804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.948 [2024-04-24 19:14:48.710835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.948 [2024-04-24 19:14:48.710898] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.948 [2024-04-24 19:14:48.710915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.948 [2024-04-24 19:14:48.710975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.948 [2024-04-24 19:14:48.710989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.948 [2024-04-24 19:14:48.711050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:01.948 [2024-04-24 19:14:48.711068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.207 NEW_FUNC[1/670]: 0x497310 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:02.207 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.207 #5 NEW cov: 11635 ft: 11636 corp: 2/34b lim: 35 exec/s: 0 rss: 71Mb L: 33/33 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:02.207 [2024-04-24 19:14:49.074124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.074185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.074308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.074329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.074445] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.074466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.207 
[2024-04-24 19:14:49.074579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.074601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.207 #6 NEW cov: 11765 ft: 12098 corp: 3/68b lim: 35 exec/s: 0 rss: 71Mb L: 34/34 MS: 1 InsertByte- 00:08:02.207 [2024-04-24 19:14:49.134202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.134230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.134336] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.134353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.134452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.134469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.134573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.134589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.207 #7 NEW cov: 11771 ft: 12408 corp: 4/102b lim: 35 exec/s: 0 rss: 71Mb L: 34/34 MS: 1 ChangeASCIIInt- 00:08:02.207 [2024-04-24 19:14:49.194442] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.194473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.194574] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.194591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.194692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.194708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.207 [2024-04-24 19:14:49.194806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.207 [2024-04-24 19:14:49.194823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.207 #8 NEW cov: 11856 ft: 12641 corp: 5/133b lim: 35 exec/s: 0 rss: 71Mb L: 31/34 MS: 1 EraseBytes- 00:08:02.466 [2024-04-24 19:14:49.245092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.245122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.245230] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.245245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.245342] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.245358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.245455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.245471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.245574] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.245589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.466 #9 NEW cov: 11856 ft: 12817 corp: 6/168b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 InsertByte- 00:08:02.466 [2024-04-24 19:14:49.304892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.304920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.305020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.305038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.305145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.305164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.305257] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.305276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.466 #10 NEW cov: 11856 ft: 12897 corp: 7/201b lim: 35 exec/s: 0 rss: 72Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:02.466 [2024-04-24 19:14:49.355159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.355189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.355305] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.355323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.355427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.355443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.355545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.355562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.466 #11 NEW cov: 11856 ft: 12948 corp: 8/233b lim: 35 exec/s: 0 rss: 72Mb L: 32/35 MS: 1 InsertByte- 00:08:02.466 [2024-04-24 19:14:49.415323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.415351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.415449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.415465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.415564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.415580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.466 [2024-04-24 19:14:49.415669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.466 [2024-04-24 19:14:49.415685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.466 #12 NEW cov: 11856 ft: 12976 corp: 9/267b lim: 35 exec/s: 0 rss: 72Mb L: 34/35 MS: 1 ChangeByte- 00:08:02.466 [2024-04-24 19:14:49.464583] ctrlr.c:1743:nvmf_ctrlr_get_features_host_identifier: *ERROR*: Get Features - Host Identifier with EXHID=0 not allowed 00:08:02.467 [2024-04-24 19:14:49.465127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.467 [2024-04-24 19:14:49.465157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.467 [2024-04-24 19:14:49.465264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.467 [2024-04-24 19:14:49.465280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.467 [2024-04-24 
19:14:49.465370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST IDENTIFIER cid:6 cdw10:00000081 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.467 [2024-04-24 19:14:49.465387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.726 NEW_FUNC[1/1]: 0x1164b60 in nvmf_ctrlr_get_features_host_identifier /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1733 00:08:02.726 #13 NEW cov: 11880 ft: 13452 corp: 10/293b lim: 35 exec/s: 0 rss: 72Mb L: 26/35 MS: 1 EraseBytes- 00:08:02.726 [2024-04-24 19:14:49.525443] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.525474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.525584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.525599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.525711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.525727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.726 #14 NEW cov: 11880 ft: 13503 corp: 11/319b lim: 35 exec/s: 0 rss: 72Mb L: 26/35 MS: 1 ShuffleBytes- 00:08:02.726 [2024-04-24 19:14:49.585868] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.585899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.586005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.586022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.586155] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.586173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.586263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.586279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.726 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:02.726 #15 NEW cov: 11903 ft: 13548 corp: 12/353b lim: 35 exec/s: 0 rss: 72Mb L: 34/35 MS: 1 CrossOver- 00:08:02.726 [2024-04-24 19:14:49.646101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:02.726 [2024-04-24 19:14:49.646131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.646244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.646261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.646365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.646381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.646498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.646515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.726 #16 NEW cov: 11903 ft: 13566 corp: 13/387b lim: 35 exec/s: 0 rss: 72Mb L: 34/35 MS: 1 ChangeByte- 00:08:02.726 [2024-04-24 19:14:49.696451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.696479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.696595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.696610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.696717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.696733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.726 [2024-04-24 19:14:49.696841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.726 [2024-04-24 19:14:49.696856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.726 #17 NEW cov: 11903 ft: 13642 corp: 14/421b lim: 35 exec/s: 17 rss: 72Mb L: 34/35 MS: 1 InsertByte- 00:08:02.985 [2024-04-24 19:14:49.746971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.747000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.747113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.747130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.747231] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.747247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.747345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.747362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.985 #18 NEW cov: 11903 ft: 13675 corp: 15/454b lim: 35 exec/s: 18 rss: 72Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:02.985 [2024-04-24 19:14:49.796271] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000041 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.796300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.796399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.796416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.985 #23 NEW cov: 11903 ft: 13966 corp: 16/469b lim: 35 exec/s: 23 rss: 72Mb L: 15/35 MS: 5 InsertByte-CMP-ChangeByte-ChangeBit-CrossOver- DE: "\007\000\000\000"- 00:08:02.985 [2024-04-24 19:14:49.847051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.847083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.847196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.847214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.847322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.847339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.847446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.847465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.985 #29 NEW cov: 11903 ft: 13985 corp: 17/502b lim: 35 exec/s: 29 rss: 72Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:02.985 [2024-04-24 19:14:49.907499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.907530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.907644] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.907662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.907766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.907784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.907887] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.907906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.985 #30 NEW cov: 11903 ft: 14036 corp: 18/533b lim: 35 exec/s: 30 rss: 72Mb L: 31/35 MS: 1 ChangeBit- 00:08:02.985 [2024-04-24 19:14:49.956868] ctrlr.c:1743:nvmf_ctrlr_get_features_host_identifier: *ERROR*: Get Features - Host Identifier with EXHID=0 not allowed 00:08:02.985 [2024-04-24 19:14:49.957420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.957453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.957565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.957583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.985 [2024-04-24 19:14:49.957680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST IDENTIFIER cid:6 cdw10:00000081 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.985 [2024-04-24 19:14:49.957698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.985 #31 NEW cov: 11903 ft: 14088 corp: 19/559b lim: 35 exec/s: 31 rss: 72Mb L: 26/35 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:08:03.243 [2024-04-24 19:14:50.007997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.008028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.008130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.008153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.008260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.008277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.008382] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.008399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.244 #32 NEW cov: 11903 ft: 14129 corp: 20/593b lim: 35 exec/s: 32 rss: 72Mb L: 34/35 MS: 1 ChangeBinInt- 00:08:03.244 [2024-04-24 19:14:50.057772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.057802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.057907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.057923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.244 #33 NEW cov: 11903 ft: 14155 corp: 21/612b lim: 35 exec/s: 33 rss: 72Mb L: 19/35 MS: 1 CrossOver- 00:08:03.244 [2024-04-24 19:14:50.108741] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.108770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.108878] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.108895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.108999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.109017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.109123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.109150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.244 #34 NEW cov: 11903 ft: 14171 corp: 22/646b lim: 35 exec/s: 34 rss: 72Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:03.244 [2024-04-24 19:14:50.159051] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.159082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.159182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.159210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.159313] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.159328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.159426] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.159442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.244 #35 NEW cov: 11903 ft: 14188 corp: 23/678b lim: 35 exec/s: 35 rss: 72Mb L: 32/35 MS: 1 InsertByte- 00:08:03.244 [2024-04-24 19:14:50.219330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.219358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.219464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.219480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.219578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.219593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.244 [2024-04-24 19:14:50.219698] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.244 [2024-04-24 19:14:50.219713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.244 #36 NEW cov: 11903 ft: 14229 corp: 24/710b lim: 35 exec/s: 36 rss: 73Mb L: 32/35 MS: 1 EraseBytes- 00:08:03.502 [2024-04-24 19:14:50.279607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.279636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.279765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.279783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.279883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.279901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.502 #37 NEW cov: 11903 ft: 14262 corp: 25/732b lim: 35 exec/s: 37 rss: 73Mb L: 22/35 MS: 1 EraseBytes- 00:08:03.502 [2024-04-24 19:14:50.330385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.330414] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.330523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.330538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.330638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.330655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.330754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.330773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.502 #38 NEW cov: 11903 ft: 14271 corp: 26/766b lim: 35 exec/s: 38 rss: 73Mb L: 34/35 MS: 1 ChangeASCIIInt- 00:08:03.502 [2024-04-24 19:14:50.391191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.391218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.391317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.502 [2024-04-24 19:14:50.391334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.502 [2024-04-24 19:14:50.391433] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.391449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.391554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.391571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.391666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.391684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.503 #39 NEW cov: 11903 ft: 14320 corp: 27/801b lim: 35 exec/s: 39 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:08:03.503 [2024-04-24 19:14:50.441179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.441208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.441311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.441336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.441432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.441447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.441550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.441566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.503 #40 NEW cov: 11903 ft: 14326 corp: 28/830b lim: 35 exec/s: 40 rss: 73Mb L: 29/35 MS: 1 EraseBytes- 00:08:03.503 [2024-04-24 19:14:50.491622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.491650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.491753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.491769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.491876] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.491893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.492003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.492017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.503 [2024-04-24 19:14:50.492126] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.503 [2024-04-24 19:14:50.492142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.762 #41 NEW cov: 11903 ft: 14341 corp: 29/865b lim: 35 exec/s: 41 rss: 73Mb L: 35/35 MS: 1 ChangeBit- 00:08:03.762 [2024-04-24 19:14:50.551684] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.551711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.551819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.551837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.551934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.551950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.552053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.552073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.762 #42 NEW cov: 11903 ft: 14375 corp: 30/898b lim: 35 exec/s: 42 rss: 73Mb L: 33/35 MS: 1 ChangeByte- 00:08:03.762 [2024-04-24 19:14:50.601149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.601176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.601279] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.601296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.762 #43 NEW cov: 11903 ft: 14403 corp: 31/913b lim: 35 exec/s: 43 rss: 73Mb L: 15/35 MS: 1 EraseBytes- 00:08:03.762 [2024-04-24 19:14:50.662591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.662620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.662726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.662743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.662847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.662863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.662956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.662976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.663081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000400 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.663097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.762 #44 NEW cov: 11903 ft: 14417 corp: 32/948b lim: 35 exec/s: 44 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:08:03.762 [2024-04-24 19:14:50.722481] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000091 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.722509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.722615] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.722631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.722740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.722755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.762 [2024-04-24 19:14:50.722865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.762 [2024-04-24 19:14:50.722880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.762 #45 NEW cov: 11903 ft: 14478 corp: 33/982b lim: 35 exec/s: 22 rss: 73Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:03.762 #45 DONE cov: 11903 ft: 14478 corp: 33/982b lim: 35 exec/s: 22 rss: 73Mb 00:08:03.762 ###### Recommended dictionary. ###### 00:08:03.762 "\007\000\000\000" # Uses: 1 00:08:03.762 ###### End of recommended dictionary. ###### 00:08:03.762 Done 45 runs in 2 second(s) 00:08:04.021 19:14:50 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:04.021 19:14:50 -- ../common.sh@72 -- # (( i++ )) 00:08:04.021 19:14:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.021 19:14:50 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:04.021 19:14:50 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:04.021 19:14:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.021 19:14:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.021 19:14:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:04.021 19:14:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:04.021 19:14:50 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:04.021 19:14:50 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:04.021 19:14:50 -- nvmf/run.sh@34 -- # printf %02d 16 00:08:04.021 19:14:50 -- nvmf/run.sh@34 -- # port=4416 00:08:04.021 19:14:50 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:04.021 19:14:50 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:04.021 19:14:50 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.021 19:14:50 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:04.021 19:14:50 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:04.021 19:14:50 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:04.021 [2024-04-24 19:14:50.932606] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:04.021 [2024-04-24 19:14:50.932682] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625513 ] 00:08:04.021 EAL: No free 2048 kB hugepages reported on node 1 00:08:04.279 [2024-04-24 19:14:51.248054] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:04.537 [2024-04-24 19:14:51.335670] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.537 [2024-04-24 19:14:51.395030] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.537 [2024-04-24 19:14:51.411246] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:04.537 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.537 INFO: Seed: 2147774850 00:08:04.537 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:04.537 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:04.537 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:04.537 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.537 #2 INITED exec/s: 0 rss: 64Mb 00:08:04.537 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:04.537 This may also happen if the target rejected all inputs we tried so far 00:08:04.537 [2024-04-24 19:14:51.460372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.537 [2024-04-24 19:14:51.460402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:04.795 NEW_FUNC[1/671]: 0x4987c0 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:04.795 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.795 #4 NEW cov: 11739 ft: 11733 corp: 2/41b lim: 105 exec/s: 0 rss: 71Mb L: 40/40 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:04.795 [2024-04-24 19:14:51.791094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.795 [2024-04-24 19:14:51.791137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #5 NEW cov: 11869 ft: 12346 corp: 3/81b lim: 105 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:08:05.054 [2024-04-24 19:14:51.841176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:51.841207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #6 NEW cov: 11875 ft: 12512 corp: 4/121b lim: 105 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 CopyPart- 00:08:05.054 [2024-04-24 19:14:51.881253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:51.881281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #12 NEW cov: 11960 ft: 12862 corp: 5/161b lim: 105 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:05.054 [2024-04-24 19:14:51.921402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357477501855964573 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:51.921431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #13 NEW cov: 11960 ft: 12934 corp: 6/201b lim: 105 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeBit- 00:08:05.054 [2024-04-24 19:14:51.961475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:51.961505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #14 NEW cov: 11960 ft: 13056 corp: 7/241b lim: 105 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 ChangeByte- 00:08:05.054 [2024-04-24 19:14:52.001652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 
[2024-04-24 19:14:52.001685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 #15 NEW cov: 11960 ft: 13112 corp: 8/263b lim: 105 exec/s: 0 rss: 72Mb L: 22/40 MS: 1 EraseBytes- 00:08:05.054 [2024-04-24 19:14:52.041957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:52.041982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.054 [2024-04-24 19:14:52.042035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:52.042052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.054 [2024-04-24 19:14:52.042114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11357407135578062749 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.054 [2024-04-24 19:14:52.042129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.313 #16 NEW cov: 11960 ft: 13650 corp: 9/335b lim: 105 exec/s: 0 rss: 72Mb L: 72/72 MS: 1 CopyPart- 00:08:05.313 [2024-04-24 19:14:52.091874] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.091900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.313 #17 NEW cov: 11960 ft: 13683 corp: 10/375b lim: 105 exec/s: 0 rss: 72Mb L: 40/72 MS: 1 ShuffleBytes- 00:08:05.313 [2024-04-24 19:14:52.131991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.132017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.313 #18 NEW cov: 11960 ft: 13723 corp: 11/397b lim: 105 exec/s: 0 rss: 72Mb L: 22/72 MS: 1 CrossOver- 00:08:05.313 [2024-04-24 19:14:52.172063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133648657821 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.172090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.313 #19 NEW cov: 11960 ft: 13814 corp: 12/437b lim: 105 exec/s: 0 rss: 72Mb L: 40/72 MS: 1 ChangeBit- 00:08:05.313 [2024-04-24 19:14:52.212552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.212579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.313 [2024-04-24 19:14:52.212630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 
[2024-04-24 19:14:52.212646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.313 [2024-04-24 19:14:52.212697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.212716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.313 [2024-04-24 19:14:52.212768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2821266740684990247 len:10024 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.212783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:05.313 #20 NEW cov: 11960 ft: 14312 corp: 13/523b lim: 105 exec/s: 0 rss: 72Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:08:05.313 [2024-04-24 19:14:52.252348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.313 [2024-04-24 19:14:52.252374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.313 #21 NEW cov: 11960 ft: 14370 corp: 14/545b lim: 105 exec/s: 0 rss: 72Mb L: 22/86 MS: 1 ShuffleBytes- 00:08:05.314 [2024-04-24 19:14:52.292438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.314 [2024-04-24 19:14:52.292465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.314 #22 NEW cov: 11960 ft: 14385 corp: 15/585b lim: 105 exec/s: 0 rss: 73Mb L: 40/86 MS: 1 ShuffleBytes- 00:08:05.605 [2024-04-24 19:14:52.332527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.332555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:05.605 #23 NEW cov: 11983 ft: 14466 corp: 16/625b lim: 105 exec/s: 0 rss: 73Mb L: 40/86 MS: 1 ChangeBinInt- 00:08:05.605 [2024-04-24 19:14:52.372637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.372664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 #24 NEW cov: 11983 ft: 14523 corp: 17/662b lim: 105 exec/s: 0 rss: 73Mb L: 37/86 MS: 1 EraseBytes- 00:08:05.605 [2024-04-24 19:14:52.412889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8970181431921507452 len:31869 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.412915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 [2024-04-24 19:14:52.412953] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:8970181431921507452 len:31869 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.412968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.605 #29 NEW cov: 11983 ft: 14795 corp: 18/712b lim: 105 exec/s: 0 rss: 73Mb L: 50/86 MS: 5 EraseBytes-CrossOver-ChangeBinInt-EraseBytes-InsertRepeatedBytes- 00:08:05.605 [2024-04-24 19:14:52.452839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:25187 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.452866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 #30 NEW cov: 11983 ft: 14833 corp: 19/752b lim: 105 exec/s: 30 rss: 73Mb L: 40/86 MS: 1 ChangeBinInt- 00:08:05.605 [2024-04-24 19:14:52.492972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357406733679828275 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.492998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 #36 NEW cov: 11983 ft: 14858 corp: 20/793b lim: 105 exec/s: 36 rss: 73Mb L: 41/86 MS: 1 InsertByte- 00:08:05.605 [2024-04-24 19:14:52.533084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.533110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 #37 NEW cov: 11983 ft: 14894 corp: 21/816b lim: 105 exec/s: 37 rss: 73Mb L: 23/86 MS: 1 InsertByte- 00:08:05.605 [2024-04-24 19:14:52.573222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357406736313851187 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.573249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.605 #38 NEW cov: 11983 ft: 14955 corp: 22/857b lim: 105 exec/s: 38 rss: 73Mb L: 41/86 MS: 1 ChangeByte- 00:08:05.605 [2024-04-24 19:14:52.613327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:8970181430008904828 len:31869 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.605 [2024-04-24 19:14:52.613353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.864 #39 NEW cov: 11983 ft: 14974 corp: 23/897b lim: 105 exec/s: 39 rss: 73Mb L: 40/86 MS: 1 CrossOver- 00:08:05.864 [2024-04-24 19:14:52.653468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407134008122781 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.864 [2024-04-24 19:14:52.653495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.864 #40 NEW cov: 11983 ft: 15001 corp: 24/938b lim: 105 exec/s: 40 rss: 73Mb L: 41/86 MS: 1 InsertByte- 00:08:05.864 [2024-04-24 19:14:52.693537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 
lba:11357407133115981213 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.864 [2024-04-24 19:14:52.693564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.864 #41 NEW cov: 11983 ft: 15003 corp: 25/960b lim: 105 exec/s: 41 rss: 73Mb L: 22/86 MS: 1 ChangeByte- 00:08:05.865 [2024-04-24 19:14:52.733640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.865 [2024-04-24 19:14:52.733667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.865 #42 NEW cov: 11983 ft: 15036 corp: 26/982b lim: 105 exec/s: 42 rss: 73Mb L: 22/86 MS: 1 ShuffleBytes- 00:08:05.865 [2024-04-24 19:14:52.773762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65422 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.865 [2024-04-24 19:14:52.773789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.865 #43 NEW cov: 11983 ft: 15044 corp: 27/1004b lim: 105 exec/s: 43 rss: 73Mb L: 22/86 MS: 1 ChangeBit- 00:08:05.865 [2024-04-24 19:14:52.813978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.865 [2024-04-24 19:14:52.814004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.865 [2024-04-24 19:14:52.814042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11313334113922293149 len:8210 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.865 [2024-04-24 19:14:52.814062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.865 #44 NEW cov: 11983 ft: 15048 corp: 28/1052b lim: 105 exec/s: 44 rss: 73Mb L: 48/86 MS: 1 CMP- DE: "\001\011o\373\253 \021\266"- 00:08:05.865 [2024-04-24 19:14:52.853992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.865 [2024-04-24 19:14:52.854018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.865 #45 NEW cov: 11983 ft: 15058 corp: 29/1075b lim: 105 exec/s: 45 rss: 73Mb L: 23/86 MS: 1 InsertByte- 00:08:06.124 [2024-04-24 19:14:52.894207] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:764940809536380317 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.894233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 [2024-04-24 19:14:52.894270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:764940812002631069 len:40448 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.894287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.124 #46 NEW cov: 11983 ft: 15065 corp: 30/1119b lim: 105 exec/s: 46 rss: 73Mb 
L: 44/86 MS: 1 CopyPart- 00:08:06.124 [2024-04-24 19:14:52.934355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13889313184910721216 len:49345 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.934381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 [2024-04-24 19:14:52.934417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:13889313184910721216 len:49345 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.934433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.124 #47 NEW cov: 11983 ft: 15075 corp: 31/1175b lim: 105 exec/s: 47 rss: 73Mb L: 56/86 MS: 1 InsertRepeatedBytes- 00:08:06.124 [2024-04-24 19:14:52.974594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.974620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 [2024-04-24 19:14:52.974665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578037661 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.974681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.124 [2024-04-24 19:14:52.974736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:52.974751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.124 #48 NEW cov: 11983 ft: 15085 corp: 32/1248b lim: 105 exec/s: 48 rss: 73Mb L: 73/86 MS: 1 CrossOver- 00:08:06.124 [2024-04-24 19:14:53.014600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:53.014626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 [2024-04-24 19:14:53.014663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:53.014678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.124 #49 NEW cov: 11983 ft: 15162 corp: 33/1304b lim: 105 exec/s: 49 rss: 73Mb L: 56/86 MS: 1 EraseBytes- 00:08:06.124 [2024-04-24 19:14:53.054573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65422 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:53.054599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 #50 NEW cov: 11983 ft: 15170 corp: 34/1326b lim: 105 exec/s: 50 rss: 73Mb L: 22/86 MS: 1 ChangeBinInt- 00:08:06.124 [2024-04-24 19:14:53.094681] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:53.094708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.124 #51 NEW cov: 11983 ft: 15171 corp: 35/1349b lim: 105 exec/s: 51 rss: 73Mb L: 23/86 MS: 1 ChangeBinInt- 00:08:06.124 [2024-04-24 19:14:53.134819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357126079046065565 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.124 [2024-04-24 19:14:53.134845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #52 NEW cov: 11983 ft: 15190 corp: 36/1385b lim: 105 exec/s: 52 rss: 74Mb L: 36/86 MS: 1 CrossOver- 00:08:06.383 [2024-04-24 19:14:53.174930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:25187 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.174956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #53 NEW cov: 11983 ft: 15193 corp: 37/1425b lim: 105 exec/s: 53 rss: 74Mb L: 40/86 MS: 1 ShuffleBytes- 00:08:06.383 [2024-04-24 19:14:53.215067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.215095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #54 NEW cov: 11983 ft: 15199 corp: 38/1465b lim: 105 exec/s: 54 rss: 74Mb L: 40/86 MS: 1 ChangeBinInt- 00:08:06.383 [2024-04-24 19:14:53.255285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786803 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.255312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 [2024-04-24 19:14:53.255382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.255399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.383 #55 NEW cov: 11983 ft: 15207 corp: 39/1517b lim: 105 exec/s: 55 rss: 74Mb L: 52/86 MS: 1 CopyPart- 00:08:06.383 [2024-04-24 19:14:53.295288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3719301768293490077 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.295314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #56 NEW cov: 11983 ft: 15219 corp: 40/1558b lim: 105 exec/s: 56 rss: 74Mb L: 41/86 MS: 1 ShuffleBytes- 00:08:06.383 [2024-04-24 19:14:53.335471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11357407133111786909 len:65438 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.335505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #57 NEW cov: 11983 ft: 15243 corp: 41/1580b lim: 105 exec/s: 57 rss: 74Mb L: 22/86 MS: 1 CrossOver- 00:08:06.383 [2024-04-24 19:14:53.365508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3719302165081791901 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.383 [2024-04-24 19:14:53.365538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.383 #58 NEW cov: 11983 ft: 15273 corp: 42/1621b lim: 105 exec/s: 58 rss: 74Mb L: 41/86 MS: 1 CrossOver- 00:08:06.642 [2024-04-24 19:14:53.405852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3719301768293490077 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.405878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.642 [2024-04-24 19:14:53.405923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578039709 len:40414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.405939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.642 [2024-04-24 19:14:53.406011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.406028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.642 #59 NEW cov: 11983 ft: 15298 corp: 43/1696b lim: 105 exec/s: 59 rss: 74Mb L: 75/86 MS: 1 CrossOver- 00:08:06.642 [2024-04-24 19:14:53.446014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3719301768293490077 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.446040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.642 [2024-04-24 19:14:53.446099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11357407135578039709 len:40414 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.446116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.642 [2024-04-24 19:14:53.446170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11357407135578037661 len:40350 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.642 [2024-04-24 19:14:53.446185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.642 #60 NEW cov: 11983 ft: 15318 corp: 44/1771b lim: 105 exec/s: 30 rss: 74Mb L: 75/86 MS: 1 ChangeBinInt- 00:08:06.642 #60 DONE cov: 11983 ft: 15318 corp: 44/1771b lim: 105 exec/s: 30 rss: 74Mb 00:08:06.642 ###### Recommended dictionary. ###### 00:08:06.642 "\001\011o\373\253 \021\266" # Uses: 0 00:08:06.642 ###### End of recommended dictionary. 
###### 00:08:06.642 Done 60 runs in 2 second(s) 00:08:06.642 19:14:53 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.642 19:14:53 -- ../common.sh@72 -- # (( i++ )) 00:08:06.642 19:14:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.642 19:14:53 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:06.642 19:14:53 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:06.642 19:14:53 -- nvmf/run.sh@24 -- # local timen=1 00:08:06.642 19:14:53 -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.642 19:14:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:06.642 19:14:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:06.642 19:14:53 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.642 19:14:53 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.642 19:14:53 -- nvmf/run.sh@34 -- # printf %02d 17 00:08:06.642 19:14:53 -- nvmf/run.sh@34 -- # port=4417 00:08:06.642 19:14:53 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:06.642 19:14:53 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:06.642 19:14:53 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.642 19:14:53 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.642 19:14:53 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.642 19:14:53 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:06.642 [2024-04-24 19:14:53.649318] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:06.642 [2024-04-24 19:14:53.649400] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625872 ] 00:08:06.901 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.160 [2024-04-24 19:14:53.976066] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.160 [2024-04-24 19:14:54.065456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.160 [2024-04-24 19:14:54.124890] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.160 [2024-04-24 19:14:54.141101] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:07.160 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:07.160 INFO: Seed: 581826335 00:08:07.160 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:07.160 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:07.160 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:07.160 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.160 #2 INITED exec/s: 0 rss: 64Mb 00:08:07.160 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.160 This may also happen if the target rejected all inputs we tried so far 00:08:07.418 [2024-04-24 19:14:54.189676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.419 [2024-04-24 19:14:54.189709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.419 [2024-04-24 19:14:54.189755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.419 [2024-04-24 19:14:54.189771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.419 [2024-04-24 19:14:54.189827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.419 [2024-04-24 19:14:54.189844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.677 NEW_FUNC[1/672]: 0x49bb40 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:07.677 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.677 #7 NEW cov: 11760 ft: 11761 corp: 2/89b lim: 120 exec/s: 0 rss: 71Mb L: 88/88 MS: 5 CrossOver-ChangeBit-ChangeBit-ChangeBinInt-InsertRepeatedBytes- 00:08:07.677 [2024-04-24 19:14:54.530418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.677 [2024-04-24 19:14:54.530459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.677 [2024-04-24 19:14:54.530513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.677 [2024-04-24 19:14:54.530533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.677 [2024-04-24 19:14:54.530584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.677 [2024-04-24 19:14:54.530598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.677 #8 NEW cov: 11890 ft: 12267 corp: 3/172b lim: 120 exec/s: 0 rss: 71Mb L: 83/88 MS: 1 EraseBytes- 00:08:07.677 [2024-04-24 19:14:54.580460] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.677 [2024-04-24 19:14:54.580488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.677 [2024-04-24 19:14:54.580538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.677 [2024-04-24 19:14:54.580555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 19:14:54.580607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.580623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.678 #9 NEW cov: 11896 ft: 12568 corp: 4/260b lim: 120 exec/s: 0 rss: 72Mb L: 88/88 MS: 1 CopyPart- 00:08:07.678 [2024-04-24 19:14:54.620553] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.620579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 19:14:54.620613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.620630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 19:14:54.620683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.620698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.678 #10 NEW cov: 11981 ft: 12802 corp: 5/344b lim: 120 exec/s: 0 rss: 72Mb L: 84/88 MS: 1 CrossOver- 00:08:07.678 [2024-04-24 19:14:54.670855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.670882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 19:14:54.670927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.670943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 19:14:54.670992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.671008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.678 [2024-04-24 
19:14:54.671065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.678 [2024-04-24 19:14:54.671084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.678 #11 NEW cov: 11981 ft: 13256 corp: 6/450b lim: 120 exec/s: 0 rss: 72Mb L: 106/106 MS: 1 CrossOver- 00:08:07.937 [2024-04-24 19:14:54.710935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070404440063 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.710961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.711012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.711028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.711085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.711117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.711169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.711192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.937 #16 NEW cov: 11981 ft: 13314 corp: 7/562b lim: 120 exec/s: 0 rss: 72Mb L: 112/112 MS: 5 ChangeByte-ShuffleBytes-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:07.937 [2024-04-24 19:14:54.751088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.751114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.751161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.751176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.751225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.751240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.751288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.751303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.937 #17 NEW cov: 11981 ft: 13382 corp: 8/668b lim: 120 exec/s: 0 rss: 72Mb L: 106/112 MS: 1 ShuffleBytes- 00:08:07.937 [2024-04-24 19:14:54.801187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.801213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.801261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.801278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.801326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.801360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.801412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.801428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.937 #18 NEW cov: 11981 ft: 13432 corp: 9/786b lim: 120 exec/s: 0 rss: 72Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:07.937 [2024-04-24 19:14:54.850931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538884809155041 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.850957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 #21 NEW cov: 11981 ft: 14353 corp: 10/821b lim: 120 exec/s: 0 rss: 72Mb L: 35/118 MS: 3 CopyPart-ChangeBinInt-CrossOver- 00:08:07.937 [2024-04-24 19:14:54.891480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.891506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.891552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.891568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.891619] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.891634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.891686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 
nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.891701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.937 #22 NEW cov: 11981 ft: 14392 corp: 11/927b lim: 120 exec/s: 0 rss: 72Mb L: 106/118 MS: 1 ChangeBit- 00:08:07.937 [2024-04-24 19:14:54.931586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070404440063 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.931612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.931658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.931674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.931724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.931740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.937 [2024-04-24 19:14:54.931793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.937 [2024-04-24 19:14:54.931809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.197 #23 NEW cov: 11981 ft: 14438 corp: 12/1039b lim: 120 exec/s: 0 rss: 72Mb L: 112/118 MS: 1 ChangeBit- 00:08:08.197 [2024-04-24 19:14:54.981708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.197 [2024-04-24 19:14:54.981735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.197 [2024-04-24 19:14:54.981799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.197 [2024-04-24 19:14:54.981814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.197 [2024-04-24 19:14:54.981865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.197 [2024-04-24 19:14:54.981881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.197 [2024-04-24 19:14:54.981932] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.197 [2024-04-24 19:14:54.981948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.197 #24 NEW cov: 11981 ft: 14500 corp: 13/1143b lim: 120 exec/s: 0 
rss: 72Mb L: 104/118 MS: 1 CopyPart- 00:08:08.197 [2024-04-24 19:14:55.021390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538884809155041 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.021416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.198 #25 NEW cov: 11981 ft: 14554 corp: 14/1178b lim: 120 exec/s: 0 rss: 72Mb L: 35/118 MS: 1 ShuffleBytes- 00:08:08.198 [2024-04-24 19:14:55.071942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.071968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.198 [2024-04-24 19:14:55.072020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.072036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.198 [2024-04-24 19:14:55.072104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.072121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.198 [2024-04-24 19:14:55.072172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.072188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.198 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.198 #26 NEW cov: 12004 ft: 14615 corp: 15/1284b lim: 120 exec/s: 0 rss: 72Mb L: 106/118 MS: 1 CrossOver- 00:08:08.198 [2024-04-24 19:14:55.111926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.111952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.198 [2024-04-24 19:14:55.112010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.112029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.198 [2024-04-24 19:14:55.112084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.198 [2024-04-24 19:14:55.112100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.198 #27 NEW cov: 12004 ft: 14628 corp: 16/1376b lim: 120 exec/s: 0 rss: 72Mb L: 92/118 MS: 1 InsertRepeatedBytes- 00:08:08.198 [2024-04-24 19:14:55.152233] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.152260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.198 [2024-04-24 19:14:55.152325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.152340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.198 [2024-04-24 19:14:55.152392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16212958662323462625 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.152408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.198 [2024-04-24 19:14:55.152460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.152475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:08.198 #33 NEW cov: 12004 ft: 14663 corp: 17/1485b lim: 120 exec/s: 0 rss: 72Mb L: 109/118 MS: 1 InsertRepeatedBytes-
00:08:08.198 [2024-04-24 19:14:55.192137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.192167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.198 [2024-04-24 19:14:55.192215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538914337055201 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.192230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.198 [2024-04-24 19:14:55.192283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.198 [2024-04-24 19:14:55.192298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.469 #34 NEW cov: 12004 ft: 14720 corp: 18/1577b lim: 120 exec/s: 34 rss: 72Mb L: 92/118 MS: 1 ChangeBinInt-
00:08:08.469 [2024-04-24 19:14:55.232347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.232374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.232410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.232426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.232482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.232498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.469 #35 NEW cov: 12004 ft: 14733 corp: 19/1665b lim: 120 exec/s: 35 rss: 73Mb L: 88/118 MS: 1 CopyPart-
00:08:08.469 [2024-04-24 19:14:55.272412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.272438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.272477] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.272492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.272543] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.272559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.469 #36 NEW cov: 12004 ft: 14752 corp: 20/1753b lim: 120 exec/s: 36 rss: 73Mb L: 88/118 MS: 1 CopyPart-
00:08:08.469 [2024-04-24 19:14:55.312529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.312555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.312606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.312622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.469 [2024-04-24 19:14:55.312673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.469 [2024-04-24 19:14:55.312690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.470 #37 NEW cov: 12004 ft: 14770 corp: 21/1831b lim: 120 exec/s: 37 rss: 73Mb L: 78/118 MS: 1 EraseBytes-
00:08:08.470 [2024-04-24 19:14:55.352744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57674 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.352772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.352817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.352833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.352884] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.352918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.352971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.352987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:08.470 #38 NEW cov: 12004 ft: 14782 corp: 22/1938b lim: 120 exec/s: 38 rss: 73Mb L: 107/118 MS: 1 InsertByte-
00:08:08.470 [2024-04-24 19:14:55.402825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.402851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.402885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.402901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.402953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.402970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.470 #39 NEW cov: 12004 ft: 14785 corp: 23/2026b lim: 120 exec/s: 39 rss: 73Mb L: 88/118 MS: 1 ChangeBinInt-
00:08:08.470 [2024-04-24 19:14:55.443033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.443065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.443116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.443132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.443185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.443200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.470 [2024-04-24 19:14:55.443253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.470 [2024-04-24 19:14:55.443268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:08.470 #40 NEW cov: 12004 ft: 14803 corp: 24/2130b lim: 120 exec/s: 40 rss: 73Mb L: 104/118 MS: 1 CMP- DE: "\001\000\000\000"-
00:08:08.763 [2024-04-24 19:14:55.492920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.492948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.492993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.493010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.763 #41 NEW cov: 12004 ft: 15123 corp: 25/2178b lim: 120 exec/s: 41 rss: 73Mb L: 48/118 MS: 1 CrossOver-
00:08:08.763 [2024-04-24 19:14:55.533320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070404440063 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.533348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.533394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.533413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.533463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.533478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.533529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.533544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:08.763 #47 NEW cov: 12004 ft: 15169 corp: 26/2291b lim: 120 exec/s: 47 rss: 73Mb L: 113/118 MS: 1 InsertByte-
00:08:08.763 [2024-04-24 19:14:55.583021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538884809155041 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.583047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 #48 NEW cov: 12004 ft: 15184 corp: 27/2315b lim: 120 exec/s: 48 rss: 73Mb L: 24/118 MS: 1 EraseBytes-
00:08:08.763 [2024-04-24 19:14:55.633335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.633361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.633398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.633413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.763 #49 NEW cov: 12004 ft: 15193 corp: 28/2384b lim: 120 exec/s: 49 rss: 73Mb L: 69/118 MS: 1 EraseBytes-
00:08:08.763 [2024-04-24 19:14:55.673714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.673740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.673804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.673821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.673872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.673887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.673939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.673953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:08.763 #50 NEW cov: 12004 ft: 15205 corp: 29/2489b lim: 120 exec/s: 50 rss: 73Mb L: 105/118 MS: 1 InsertByte-
00:08:08.763 [2024-04-24 19:14:55.723715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.723741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.723779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.723793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.723845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.723861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:08.763 #51 NEW cov: 12004 ft: 15221 corp: 30/2573b lim: 120 exec/s: 51 rss: 73Mb L: 84/118 MS: 1 ChangeBit-
00:08:08.763 [2024-04-24 19:14:55.763701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.763727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:08.763 [2024-04-24 19:14:55.763763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:08.763 [2024-04-24 19:14:55.763779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.042 #52 NEW cov: 12004 ft: 15241 corp: 31/2621b lim: 120 exec/s: 52 rss: 73Mb L: 48/118 MS: 1 ChangeBinInt-
00:08:09.042 [2024-04-24 19:14:55.814165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.814194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.814252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.814268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.814319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.814334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.814385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.814401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:09.042 #53 NEW cov: 12004 ft: 15259 corp: 32/2738b lim: 120 exec/s: 53 rss: 74Mb L: 117/118 MS: 1 InsertRepeatedBytes-
00:08:09.042 [2024-04-24 19:14:55.864245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.864272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.864320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16281042488194621921 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.864337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.864389] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.864404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.864457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.864473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.904351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.904377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.904427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16281042488194621921 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.904442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.904508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.904525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.904577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.904592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:09.042 #55 NEW cov: 12004 ft: 15275 corp: 33/2842b lim: 120 exec/s: 55 rss: 74Mb L: 104/118 MS: 2 ChangeBit-CopyPart-
00:08:09.042 [2024-04-24 19:14:55.944202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743944355242495 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.944227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.042 [2024-04-24 19:14:55.944264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.042 [2024-04-24 19:14:55.944279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.043 #56 NEW cov: 12004 ft: 15330 corp: 34/2890b lim: 120 exec/s: 56 rss: 74Mb L: 48/118 MS: 1 ChangeBinInt-
00:08:09.043 [2024-04-24 19:14:55.984490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:55.984515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.043 [2024-04-24 19:14:55.984559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:55.984575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.043 [2024-04-24 19:14:55.984624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:55.984640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.043 #57 NEW cov: 12004 ft: 15335 corp: 35/2978b lim: 120 exec/s: 57 rss: 74Mb L: 88/118 MS: 1 ChangeBit-
00:08:09.043 [2024-04-24 19:14:56.024763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:56.024790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.043 [2024-04-24 19:14:56.024832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:56.024845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.043 [2024-04-24 19:14:56.024894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:56.024909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.043 [2024-04-24 19:14:56.024957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.043 [2024-04-24 19:14:56.024971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:09.043 #58 NEW cov: 12004 ft: 15340 corp: 36/3082b lim: 120 exec/s: 58 rss: 74Mb L: 104/118 MS: 1 CMP- DE: "\021\000\000\000"-
00:08:09.308 [2024-04-24 19:14:56.064411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:966399222033 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.064440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.308 #59 NEW cov: 12004 ft: 15420 corp: 37/3106b lim: 120 exec/s: 59 rss: 74Mb L: 24/118 MS: 1 PersAutoDict- DE: "\021\000\000\000"-
00:08:09.308 [2024-04-24 19:14:56.114992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57674 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.115018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.308 [2024-04-24 19:14:56.115072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57611 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.115088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:08:09.308 [2024-04-24 19:14:56.115137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.115153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:08:09.308 [2024-04-24 19:14:56.115204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.115218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:08:09.308 #60 NEW cov: 12004 ft: 15432 corp: 38/3213b lim: 120 exec/s: 60 rss: 74Mb L: 107/118 MS: 1 CrossOver-
00:08:09.308 [2024-04-24 19:14:56.154723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538884809155041 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:09.308 [2024-04-24 19:14:56.154750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:08:09.308 #61 NEW cov: 12004 ft: 15452 corp: 39/3237b lim: 120 exec/s: 30 rss: 74Mb L: 24/118 MS: 1 CrossOver-
00:08:09.308 #61 DONE cov: 12004 ft: 15452 corp: 39/3237b lim: 120 exec/s: 30 rss: 74Mb
00:08:09.308 ###### Recommended dictionary. ######
00:08:09.308 "\001\000\000\000" # Uses: 0
00:08:09.308 "\021\000\000\000" # Uses: 1
00:08:09.308 ###### End of recommended dictionary. ######
00:08:09.308 Done 61 runs in 2 second(s)
00:08:09.308 19:14:56 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz
00:08:09.308 19:14:56 -- ../common.sh@72 -- # (( i++ ))
00:08:09.308 19:14:56 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:09.308 19:14:56 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
00:08:09.308 19:14:56 -- nvmf/run.sh@23 -- # local fuzzer_type=18
00:08:09.308 19:14:56 -- nvmf/run.sh@24 -- # local timen=1
00:08:09.308 19:14:56 -- nvmf/run.sh@25 -- # local core=0x1
00:08:09.308 19:14:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:09.308 19:14:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
00:08:09.308 19:14:56 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz
00:08:09.308 19:14:56 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
00:08:09.308 19:14:56 -- nvmf/run.sh@34 -- # printf %02d 18
00:08:09.308 19:14:56 -- nvmf/run.sh@34 -- # port=4418
00:08:09.308 19:14:56 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:08:09.308 19:14:56 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
00:08:09.308 19:14:56 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:09.308 19:14:56 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:09.308 19:14:56 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create
00:08:09.308 19:14:56 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18
[2024-04-24 19:14:56.358185] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
[2024-04-24 19:14:56.358278] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626233 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-04-24 19:14:56.678917] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-04-24 19:14:56.771664] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-04-24 19:14:56.831183] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-04-24 19:14:56.847385] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 ***
INFO: Running with entropic power schedule (0xFF, 100).
INFO: Seed: 3288818879
INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2),
INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48),
INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
INFO: A corpus is not provided, starting from an empty corpus
#2 INITED exec/s: 0 rss: 64Mb
WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? This may also happen if the target rejected all inputs we tried so far
[2024-04-24 19:14:56.896272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:56.896302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
NEW_FUNC[1/670]: 0x49f430 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562
NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
#12 NEW cov: 11703 ft: 11704 corp: 2/22b lim: 100 exec/s: 0 rss: 71Mb L: 21/21 MS: 5 CopyPart-CopyPart-InsertByte-ShuffleBytes-InsertRepeatedBytes-
[2024-04-24 19:14:57.227084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.227125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#13 NEW cov: 11833 ft: 12226 corp: 3/43b lim: 100 exec/s: 0 rss: 71Mb L: 21/21 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"-
[2024-04-24 19:14:57.277256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.277283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.277349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.277364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#14 NEW cov: 11839 ft: 12802 corp: 4/94b lim: 100 exec/s: 0 rss: 72Mb L: 51/51 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:57.317368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.317394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.317427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.317443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#15 NEW cov: 11924 ft: 12965 corp: 5/145b lim: 100 exec/s: 0 rss: 72Mb L: 51/51 MS: 1 ChangeBit-
[2024-04-24 19:14:57.367474] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.367498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#16 NEW cov: 11924 ft: 13108 corp: 6/166b lim: 100 exec/s: 0 rss: 72Mb L: 21/51 MS: 1 CopyPart-
[2024-04-24 19:14:57.407701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.407726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.407771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.407786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:57.407837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:57.407852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#17 NEW cov: 11924 ft: 13441 corp: 7/245b lim: 100 exec/s: 0 rss: 72Mb L: 79/79 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:57.447617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.447642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#18 NEW cov: 11924 ft: 13552 corp: 8/266b lim: 100 exec/s: 0 rss: 72Mb L: 21/79 MS: 1 CopyPart-
[2024-04-24 19:14:57.487727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.487753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#19 NEW cov: 11924 ft: 13575 corp: 9/287b lim: 100 exec/s: 0 rss: 72Mb L: 21/79 MS: 1 ChangeBinInt-
[2024-04-24 19:14:57.528012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.528037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.528088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.528107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:57.528158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:57.528173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#20 NEW cov: 11924 ft: 13620 corp: 10/347b lim: 100 exec/s: 0 rss: 72Mb L: 60/79 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:57.567947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.567972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#21 NEW cov: 11924 ft: 13656 corp: 11/368b lim: 100 exec/s: 0 rss: 72Mb L: 21/79 MS: 1 CrossOver-
[2024-04-24 19:14:57.608184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.608208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.608247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.608261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#25 NEW cov: 11924 ft: 13690 corp: 12/419b lim: 100 exec/s: 0 rss: 72Mb L: 51/79 MS: 4 ChangeByte-CMP-ChangeByte-CrossOver- DE: "\004\000"-
[2024-04-24 19:14:57.648191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.648216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#26 NEW cov: 11924 ft: 13775 corp: 13/441b lim: 100 exec/s: 0 rss: 72Mb L: 22/79 MS: 1 InsertByte-
[2024-04-24 19:14:57.688292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.688317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#27 NEW cov: 11924 ft: 13793 corp: 14/463b lim: 100 exec/s: 0 rss: 72Mb L: 22/79 MS: 1 CrossOver-
[2024-04-24 19:14:57.728407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.728433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#28 NEW cov: 11924 ft: 13818 corp: 15/484b lim: 100 exec/s: 0 rss: 72Mb L: 21/79 MS: 1 ChangeBit-
[2024-04-24 19:14:57.758450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.758474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
#29 NEW cov: 11947 ft: 13896 corp: 16/519b lim: 100 exec/s: 0 rss: 72Mb L: 35/79 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:57.808854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.808879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.808929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.808944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:57.808994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:57.809010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#30 NEW cov: 11947 ft: 13924 corp: 17/598b lim: 100 exec/s: 0 rss: 73Mb L: 79/79 MS: 1 PersAutoDict- DE: "\004\000"-
[2024-04-24 19:14:57.848744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.848770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#31 NEW cov: 11947 ft: 13953 corp: 18/619b lim: 100 exec/s: 0 rss: 73Mb L: 21/79 MS: 1 ChangeBinInt-
[2024-04-24 19:14:57.888929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.888954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.888990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.889004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#32 NEW cov: 11947 ft: 13965 corp: 19/661b lim: 100 exec/s: 32 rss: 73Mb L: 42/79 MS: 1 CrossOver-
[2024-04-24 19:14:57.929051] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.929082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:57.929118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:57.929133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#38 NEW cov: 11947 ft: 14095 corp: 20/712b lim: 100 exec/s: 38 rss: 73Mb L: 51/79 MS: 1 ShuffleBytes-
[2024-04-24 19:14:57.969041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:57.969070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#39 NEW cov: 11947 ft: 14145 corp: 21/747b lim: 100 exec/s: 39 rss: 73Mb L: 35/79 MS: 1 CrossOver-
[2024-04-24 19:14:58.009427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.009452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.009500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.009515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.009583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.009597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#40 NEW cov: 11947 ft: 14193 corp: 22/815b lim: 100 exec/s: 40 rss: 73Mb L: 68/79 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:58.049337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.049362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#41 NEW cov: 11947 ft: 14230 corp: 23/840b lim: 100 exec/s: 41 rss: 73Mb L: 25/79 MS: 1 CopyPart-
[2024-04-24 19:14:58.089796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.089821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.089868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.089886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.089935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.089949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
[2024-04-24 19:14:58.089999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
[2024-04-24 19:14:58.090013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
#42 NEW cov: 11947 ft: 14516 corp: 24/936b lim: 100 exec/s: 42 rss: 73Mb L: 96/96 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:58.139815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.139840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.139885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.139899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.139951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.139965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#48 NEW cov: 11947 ft: 14561 corp: 25/1015b lim: 100 exec/s: 48 rss: 73Mb L: 79/96 MS: 1 ShuffleBytes-
[2024-04-24 19:14:58.179822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.179848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.179884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.179900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#49 NEW cov: 11947 ft: 14611 corp: 26/1066b lim: 100 exec/s: 49 rss: 73Mb L: 51/96 MS: 1 ChangeBinInt-
[2024-04-24 19:14:58.219890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.219916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.219982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.219998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#50 NEW cov: 11947 ft: 14617 corp: 27/1108b lim: 100 exec/s: 50 rss: 73Mb L: 42/96 MS: 1 CrossOver-
[2024-04-24 19:14:58.260269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.260295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.260346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.260359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.260423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.260438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
[2024-04-24 19:14:58.260490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
[2024-04-24 19:14:58.260504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
#51 NEW cov: 11947 ft: 14636 corp: 28/1205b lim: 100 exec/s: 51 rss: 73Mb L: 97/97 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:58.299981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.300007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#52 NEW cov: 11947 ft: 14656 corp: 29/1231b lim: 100 exec/s: 52 rss: 73Mb L: 26/97 MS: 1 InsertByte-
[2024-04-24 19:14:58.340133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.340158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#53 NEW cov: 11947 ft: 14680 corp: 30/1252b lim: 100 exec/s: 53 rss: 73Mb L: 21/97 MS: 1 ChangeBinInt-
[2024-04-24 19:14:58.380283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.380309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#54 NEW cov: 11947 ft: 14702 corp: 31/1279b lim: 100 exec/s: 54 rss: 74Mb L: 27/97 MS: 1 InsertByte-
[2024-04-24 19:14:58.420571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.420596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.420648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.420662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.420715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.420730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#55 NEW cov: 11947 ft: 14711 corp: 32/1339b lim: 100 exec/s: 55 rss: 74Mb L: 60/97 MS: 1 ChangeBinInt-
[2024-04-24 19:14:58.460596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.460621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.460658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.460672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#56 NEW cov: 11947 ft: 14714 corp: 33/1390b lim: 100 exec/s: 56 rss: 74Mb L: 51/97 MS: 1 CopyPart-
[2024-04-24 19:14:58.500602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.500627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#57 NEW cov: 11947 ft: 14734 corp: 34/1411b lim: 100 exec/s: 57 rss: 74Mb L: 21/97 MS: 1 ShuffleBytes-
[2024-04-24 19:14:58.530651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.530676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#58 NEW cov: 11947 ft: 14741 corp: 35/1449b lim: 100 exec/s: 58 rss: 74Mb L: 38/97 MS: 1 CrossOver-
[2024-04-24 19:14:58.570801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.570829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#62 NEW cov: 11947 ft: 14775 corp: 36/1475b lim: 100 exec/s: 62 rss: 74Mb L: 26/97 MS: 4 EraseBytes-ShuffleBytes-EraseBytes-CMP- DE: "W\334\032\340\226\177\000\000"-
[2024-04-24 19:14:58.611053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.611084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.611149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.611164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.611218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.611232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#63 NEW cov: 11947 ft: 14784 corp: 37/1537b lim: 100 exec/s: 63 rss: 74Mb L: 62/97 MS: 1 InsertRepeatedBytes-
[2024-04-24 19:14:58.651106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.651132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.651169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.651184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#64 NEW cov: 11947 ft: 14798 corp: 38/1592b lim: 100 exec/s: 64 rss: 74Mb L: 55/97 MS: 1 CMP- DE: "\000\000\000\000"-
[2024-04-24 19:14:58.691329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.691354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.691415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.691431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.691485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.691500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
#65 NEW cov: 11947 ft: 14801 corp: 39/1671b lim: 100 exec/s: 65 rss: 74Mb L: 79/97 MS: 1 PersAutoDict- DE: "\004\000"-
[2024-04-24 19:14:58.731578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.731604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.731669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.731684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
[2024-04-24 19:14:58.731735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
[2024-04-24 19:14:58.731750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
[2024-04-24 19:14:58.731801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
[2024-04-24 19:14:58.731815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
#66 NEW cov: 11947 ft: 14809 corp: 40/1758b lim: 100 exec/s: 66 rss: 74Mb L: 87/97 MS: 1 CrossOver-
[2024-04-24 19:14:58.771383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.771408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#67 NEW cov: 11947 ft: 14835 corp: 41/1794b lim: 100 exec/s: 67 rss: 74Mb L: 36/97 MS: 1 CrossOver-
[2024-04-24 19:14:58.811482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.811506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
#68 NEW cov: 11947 ft: 14853 corp: 42/1815b lim: 100 exec/s: 68 rss: 74Mb L: 21/97 MS: 1 CopyPart-
[2024-04-24 19:14:58.851718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
[2024-04-24 19:14:58.851744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
[2024-04-24 19:14:58.851791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
[2024-04-24 19:14:58.851807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
#69 NEW cov: 11947 ft: 14864 corp: 43/1866b lim: 100 exec/s: 34 rss: 74Mb L: 51/97 MS: 1 CopyPart-
#69 DONE cov: 11947 ft: 14864 corp: 43/1866b lim: 100 exec/s: 34 rss: 74Mb
###### Recommended dictionary. ######
"\377\377\377\377\377\377\377\377" # Uses: 0
"\004\000" # Uses: 2
"W\334\032\340\226\177\000\000" # Uses: 0
"\000\000\000\000" # Uses: 0
###### 00:08:11.903 Done 69 runs in 2 second(s) 00:08:12.162 19:14:59 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:12.162 19:14:59 -- ../common.sh@72 -- # (( i++ )) 00:08:12.162 19:14:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.162 19:14:59 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:12.162 19:14:59 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:12.162 19:14:59 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.162 19:14:59 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.162 19:14:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:12.162 19:14:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:12.162 19:14:59 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:12.162 19:14:59 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:12.162 19:14:59 -- nvmf/run.sh@34 -- # printf %02d 19 00:08:12.162 19:14:59 -- nvmf/run.sh@34 -- # port=4419 00:08:12.162 19:14:59 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:12.163 19:14:59 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:12.163 19:14:59 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.163 19:14:59 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:12.163 19:14:59 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:12.163 19:14:59 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:12.163 [2024-04-24 19:14:59.062439] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:12.163 [2024-04-24 19:14:59.062516] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626592 ] 00:08:12.163 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.421 [2024-04-24 19:14:59.383373] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.679 [2024-04-24 19:14:59.469809] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.679 [2024-04-24 19:14:59.529414] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.679 [2024-04-24 19:14:59.545615] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:12.679 INFO: Running with entropic power schedule (0xFF, 100). 
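The run.sh trace above shows how each fuzzer instance is wired up: the fuzzer number is zero-padded and appended to 44 to form the TCP port (4419 for -Z 19), a fresh per-instance corpus directory is created, sed rewrites the trsvcid in fuzz_json.conf to match, and LSAN suppressions for spdk_nvmf_qpair_disconnect and nvmf_ctrlr_create are echoed into the suppression file before llvm_nvme_fuzz starts against the freshly listening NVMe/TCP target. The NEW_FUNC lines further down hint at the harness shape: libFuzzer drives TestOneInput, which forwards each input to the command-specific fuzz_nvm_* handler selected by -Z. A minimal sketch of that dispatch, assuming a table-based design; every name except TestOneInput and the fuzz_nvm_* symbols is hypothetical, not copied from llvm_nvme_fuzz.c:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Stand-in for SPDK's struct spdk_nvme_cmd; only the fields the
     * sketches below touch are included. */
    struct nvme_cmd_sketch {
        uint8_t  opc;
        uint32_t nsid;
        uint32_t cdw10;
        uint32_t cdw11;
        uint32_t cdw12;
    };

    typedef void (*fuzz_handler_fn)(struct nvme_cmd_sketch *cmd,
                                    const uint8_t *data, size_t size);

    static fuzz_handler_fn g_handlers[64]; /* indexed by the -Z number */
    static int g_fuzzer_type = 19;         /* set from -Z at startup   */

    int TestOneInput(const uint8_t *data, size_t size)
    {
        struct nvme_cmd_sketch cmd;

        memset(&cmd, 0, sizeof(cmd));
        if (g_handlers[g_fuzzer_type] != NULL) {
            /* Build one command from the raw input bytes... */
            g_handlers[g_fuzzer_type](&cmd, data, size);
            /* ...then submit it to the NVMe/TCP target on port 44xx,
             * producing the command/completion notice pairs in this log. */
        }
        return 0;
    }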
00:08:12.679 INFO: Seed: 1692837185 00:08:12.679 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:12.679 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:12.679 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:12.679 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.679 #2 INITED exec/s: 0 rss: 64Mb 00:08:12.679 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.679 This may also happen if the target rejected all inputs we tried so far 00:08:12.679 [2024-04-24 19:14:59.590348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:12.679 [2024-04-24 19:14:59.590384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.679 [2024-04-24 19:14:59.590420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:12.679 [2024-04-24 19:14:59.590438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.938 NEW_FUNC[1/670]: 0x4a23f0 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:12.938 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:12.938 #4 NEW cov: 11681 ft: 11682 corp: 2/26b lim: 50 exec/s: 0 rss: 71Mb L: 25/25 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:12.938 [2024-04-24 19:14:59.931190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:12.938 [2024-04-24 19:14:59.931240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.938 [2024-04-24 19:14:59.931291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:12.938 [2024-04-24 19:14:59.931309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.938 [2024-04-24 19:14:59.931337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914838024376868060 len:56541 00:08:12.938 [2024-04-24 19:14:59.931355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.197 #5 NEW cov: 11811 ft: 12470 corp: 3/63b lim: 50 exec/s: 0 rss: 71Mb L: 37/37 MS: 1 CopyPart- 00:08:13.197 [2024-04-24 19:15:00.001222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 00:08:13.197 [2024-04-24 19:15:00.001259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.197 #6 NEW cov: 11817 ft: 12996 corp: 4/82b lim: 50 exec/s: 0 rss: 72Mb L: 19/37 MS: 1 EraseBytes- 00:08:13.197 [2024-04-24 19:15:00.071404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:0 len:1 00:08:13.197 [2024-04-24 19:15:00.071443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.197 [2024-04-24 19:15:00.071498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.197 [2024-04-24 19:15:00.071517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.197 #11 NEW cov: 11902 ft: 13346 corp: 5/108b lim: 50 exec/s: 0 rss: 72Mb L: 26/37 MS: 5 ChangeBit-ShuffleBytes-CMP-ShuffleBytes-InsertRepeatedBytes- DE: "\000\000"- 00:08:13.197 [2024-04-24 19:15:00.131618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.197 [2024-04-24 19:15:00.131654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.197 [2024-04-24 19:15:00.131704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:13.197 [2024-04-24 19:15:00.131723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.197 #12 NEW cov: 11902 ft: 13418 corp: 6/133b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ShuffleBytes- 00:08:13.197 [2024-04-24 19:15:00.191699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.197 [2024-04-24 19:15:00.191732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.197 [2024-04-24 19:15:00.191766] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56321 00:08:13.197 [2024-04-24 19:15:00.191784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.456 #13 NEW cov: 11902 ft: 13554 corp: 7/158b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:13.456 [2024-04-24 19:15:00.261911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.456 [2024-04-24 19:15:00.261943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.456 [2024-04-24 19:15:00.261976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:13.456 [2024-04-24 19:15:00.261995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.456 #14 NEW cov: 11902 ft: 13612 corp: 8/183b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ShuffleBytes- 00:08:13.456 [2024-04-24 19:15:00.312001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.456 [2024-04-24 19:15:00.312033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.456 [2024-04-24 19:15:00.312074] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56321 00:08:13.456 [2024-04-24 19:15:00.312110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.456 #15 NEW cov: 11902 ft: 13694 corp: 9/208b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ShuffleBytes- 00:08:13.456 [2024-04-24 19:15:00.382245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.456 [2024-04-24 19:15:00.382275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.456 [2024-04-24 19:15:00.382327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:13.456 [2024-04-24 19:15:00.382347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.456 #16 NEW cov: 11902 ft: 13725 corp: 10/233b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 ShuffleBytes- 00:08:13.456 [2024-04-24 19:15:00.432332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.456 [2024-04-24 19:15:00.432365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.456 #17 NEW cov: 11902 ft: 13870 corp: 11/252b lim: 50 exec/s: 0 rss: 72Mb L: 19/37 MS: 1 EraseBytes- 00:08:13.715 [2024-04-24 19:15:00.483002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.715 [2024-04-24 19:15:00.483033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.715 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.715 #18 NEW cov: 11925 ft: 13938 corp: 12/268b lim: 50 exec/s: 0 rss: 72Mb L: 16/37 MS: 1 EraseBytes- 00:08:13.715 [2024-04-24 19:15:00.533267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.715 [2024-04-24 19:15:00.533296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.533369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56321 00:08:13.715 [2024-04-24 19:15:00.533387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.715 #19 NEW cov: 11925 ft: 14079 corp: 13/293b lim: 50 exec/s: 0 rss: 72Mb L: 25/37 MS: 1 CopyPart- 00:08:13.715 [2024-04-24 19:15:00.573347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.715 [2024-04-24 19:15:00.573376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.573421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:62167336032721920 len:56541 00:08:13.715 [2024-04-24 19:15:00.573439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.715 #20 NEW cov: 11925 ft: 14140 corp: 14/318b lim: 50 exec/s: 20 rss: 72Mb L: 25/37 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:13.715 [2024-04-24 19:15:00.613681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:13.715 [2024-04-24 19:15:00.613708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.613747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:13.715 [2024-04-24 19:15:00.613760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.613812] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:6877 00:08:13.715 [2024-04-24 19:15:00.613828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.613882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 00:08:13.715 [2024-04-24 19:15:00.613897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.715 #21 NEW cov: 11925 ft: 14412 corp: 15/365b lim: 50 exec/s: 21 rss: 72Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:13.715 [2024-04-24 19:15:00.663620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.715 [2024-04-24 19:15:00.663652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.715 [2024-04-24 19:15:00.663725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:13.715 [2024-04-24 19:15:00.663743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.715 #22 NEW cov: 11925 ft: 14504 corp: 16/387b lim: 50 exec/s: 22 rss: 72Mb L: 22/47 MS: 1 EraseBytes- 00:08:13.715 [2024-04-24 19:15:00.703644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838024376868060 len:56541 00:08:13.715 [2024-04-24 19:15:00.703675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 #23 NEW cov: 11925 ft: 14514 corp: 17/406b lim: 50 exec/s: 23 rss: 72Mb L: 19/47 MS: 1 ChangeBit- 00:08:13.974 [2024-04-24 19:15:00.743900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:8961 00:08:13.974 [2024-04-24 19:15:00.743930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 [2024-04-24 19:15:00.743985] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.974 [2024-04-24 19:15:00.744003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.974 #24 NEW cov: 11925 ft: 14540 corp: 18/433b lim: 50 exec/s: 24 rss: 72Mb L: 27/47 MS: 1 InsertByte- 00:08:13.974 [2024-04-24 19:15:00.793881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122219228 len:56541 00:08:13.974 [2024-04-24 19:15:00.793909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 #25 NEW cov: 11925 ft: 14560 corp: 19/452b lim: 50 exec/s: 25 rss: 72Mb L: 19/47 MS: 1 ChangeBit- 00:08:13.974 [2024-04-24 19:15:00.833719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.974 [2024-04-24 19:15:00.833748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 #26 NEW cov: 11925 ft: 14607 corp: 20/471b lim: 50 exec/s: 26 rss: 72Mb L: 19/47 MS: 1 EraseBytes- 00:08:13.974 [2024-04-24 19:15:00.874194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.974 [2024-04-24 19:15:00.874222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 [2024-04-24 19:15:00.874278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:13.974 [2024-04-24 19:15:00.874296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.974 #27 NEW cov: 11925 ft: 14654 corp: 21/496b lim: 50 exec/s: 27 rss: 72Mb L: 25/47 MS: 1 ShuffleBytes- 00:08:13.974 [2024-04-24 19:15:00.914228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:13.974 [2024-04-24 19:15:00.914255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 #28 NEW cov: 11925 ft: 14691 corp: 22/512b lim: 50 exec/s: 28 rss: 73Mb L: 16/47 MS: 1 CopyPart- 00:08:13.974 [2024-04-24 19:15:00.964401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:256 00:08:13.974 [2024-04-24 19:15:00.964429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.974 #29 NEW cov: 11925 ft: 14696 corp: 23/528b lim: 50 exec/s: 29 rss: 73Mb L: 16/47 MS: 1 CrossOver- 00:08:14.232 [2024-04-24 19:15:01.004713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17340673859090677657 len:1 00:08:14.232 [2024-04-24 19:15:01.004740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.232 [2024-04-24 19:15:01.004779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851624184872960 len:1 00:08:14.232 
[2024-04-24 19:15:01.004794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.232 [2024-04-24 19:15:01.004850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10995116277760 len:1 00:08:14.232 [2024-04-24 19:15:01.004866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.232 #32 NEW cov: 11925 ft: 14714 corp: 24/562b lim: 50 exec/s: 32 rss: 73Mb L: 34/47 MS: 3 CMP-EraseBytes-CrossOver- DE: "\001\000\177\231\360\026\246s"- 00:08:14.232 [2024-04-24 19:15:01.044622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:14.232 [2024-04-24 19:15:01.044651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.232 #33 NEW cov: 11925 ft: 14722 corp: 25/581b lim: 50 exec/s: 33 rss: 73Mb L: 19/47 MS: 1 ShuffleBytes- 00:08:14.232 [2024-04-24 19:15:01.084682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.232 [2024-04-24 19:15:01.084710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.232 #34 NEW cov: 11925 ft: 14758 corp: 26/598b lim: 50 exec/s: 34 rss: 73Mb L: 17/47 MS: 1 EraseBytes- 00:08:14.232 [2024-04-24 19:15:01.134986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56360 00:08:14.232 [2024-04-24 19:15:01.135016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.232 [2024-04-24 19:15:01.135078] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838021273083100 len:56541 00:08:14.232 [2024-04-24 19:15:01.135099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.232 #35 NEW cov: 11925 ft: 14820 corp: 27/623b lim: 50 exec/s: 35 rss: 73Mb L: 25/47 MS: 1 ChangeBinInt- 00:08:14.232 [2024-04-24 19:15:01.185134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:14.232 [2024-04-24 19:15:01.185162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.232 [2024-04-24 19:15:01.185202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838024376868060 len:56541 00:08:14.232 [2024-04-24 19:15:01.185220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.232 #36 NEW cov: 11925 ft: 14853 corp: 28/648b lim: 50 exec/s: 36 rss: 73Mb L: 25/47 MS: 1 CrossOver- 00:08:14.232 [2024-04-24 19:15:01.225117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914697283633732828 len:56541 00:08:14.232 [2024-04-24 19:15:01.225144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
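Each finding in this run is a command/completion notice pair followed by a libFuzzer status line: #N is the finding counter, cov and ft are coverage and feature counts, corp gives corpus entries and total bytes, L is this input's length against the lim cap, and MS names the mutations that produced it. The lba/len values in the notices are simply reinterpreted input bytes: lba 15914838024376868060 is 0xDC repeated across all eight bytes, exactly what an InsertRepeatedBytes mutation yields, and len 56541 is 0xDCDC + 1, consistent with the driver printing the 0-based NLB plus one. A hedged sketch of how such a WRITE UNCORRECTABLE command could be populated; field placement follows the NVMe spec (SLBA in CDW10/CDW11, 0-based NLB in CDW12[15:0]), while the struct and function names are stand-ins for the real handler at llvm_nvme_fuzz.c:582 named in the NEW_FUNC line above:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct nvme_cmd_sketch {
        uint8_t  opc;
        uint32_t cdw10;
        uint32_t cdw11;
        uint32_t cdw12;
    };

    /* Interpret the head of the fuzz input as SLBA and NLB, as the
     * lba/len values in the notices suggest. */
    static void fuzz_write_uncorrectable_sketch(struct nvme_cmd_sketch *cmd,
                                                const uint8_t *data,
                                                size_t size)
    {
        uint64_t slba = 0;
        uint16_t nlb = 0;

        if (size >= sizeof(slba)) {
            memcpy(&slba, data, sizeof(slba));
        }
        if (size >= sizeof(slba) + sizeof(nlb)) {
            memcpy(&nlb, data + sizeof(slba), sizeof(nlb));
        }

        cmd->opc   = 0x04;                   /* WRITE UNCORRECTABLE   */
        cmd->cdw10 = (uint32_t)slba;         /* SLBA, low 32 bits     */
        cmd->cdw11 = (uint32_t)(slba >> 32); /* SLBA, high 32 bits    */
        cmd->cdw12 = nlb;                    /* NLB, 0-based, [15:0]  */
    }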
00:08:14.491 #37 NEW cov: 11925 ft: 14865 corp: 29/667b lim: 50 exec/s: 37 rss: 73Mb L: 19/47 MS: 1 ChangeBit- 00:08:14.491 [2024-04-24 19:15:01.265229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:6877 00:08:14.491 [2024-04-24 19:15:01.265260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 #38 NEW cov: 11925 ft: 14871 corp: 30/686b lim: 50 exec/s: 38 rss: 73Mb L: 19/47 MS: 1 CrossOver- 00:08:14.491 [2024-04-24 19:15:01.305648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56576 00:08:14.491 [2024-04-24 19:15:01.305675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 [2024-04-24 19:15:01.305720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:14.491 [2024-04-24 19:15:01.305737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.491 [2024-04-24 19:15:01.305806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:56541 00:08:14.491 [2024-04-24 19:15:01.305822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.491 [2024-04-24 19:15:01.305877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:15914838024376868060 len:56541 00:08:14.491 [2024-04-24 19:15:01.305893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.491 #39 NEW cov: 11925 ft: 14910 corp: 31/730b lim: 50 exec/s: 39 rss: 73Mb L: 44/47 MS: 1 InsertRepeatedBytes- 00:08:14.491 [2024-04-24 19:15:01.345412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914696343035895004 len:2417 00:08:14.491 [2024-04-24 19:15:01.345440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.491 #40 NEW cov: 11925 ft: 14934 corp: 32/749b lim: 50 exec/s: 40 rss: 73Mb L: 19/47 MS: 1 CMP- DE: "\001\011p\000;9{\022"- 00:08:14.491 [2024-04-24 19:15:01.385771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.491 [2024-04-24 19:15:01.385797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.492 [2024-04-24 19:15:01.385835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:944892805120 len:56541 00:08:14.492 [2024-04-24 19:15:01.385851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.492 [2024-04-24 19:15:01.385906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914595186925919452 len:1 00:08:14.492 [2024-04-24 19:15:01.385922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.492 #41 NEW cov: 11925 ft: 14959 corp: 33/784b lim: 50 exec/s: 41 rss: 73Mb L: 35/47 MS: 1 CrossOver- 00:08:14.492 [2024-04-24 19:15:01.425772] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122088156 len:56541 00:08:14.492 [2024-04-24 19:15:01.425799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.492 [2024-04-24 19:15:01.425875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:62167336032721920 len:56321 00:08:14.492 [2024-04-24 19:15:01.425892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.492 #42 NEW cov: 11925 ft: 14979 corp: 34/809b lim: 50 exec/s: 42 rss: 73Mb L: 25/47 MS: 1 PersAutoDict- DE: "\000\000"- 00:08:14.492 [2024-04-24 19:15:01.465779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1935664777764068572 len:56541 00:08:14.492 [2024-04-24 19:15:01.465808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.492 #43 NEW cov: 11925 ft: 14983 corp: 35/821b lim: 50 exec/s: 43 rss: 73Mb L: 12/47 MS: 1 CrossOver- 00:08:14.492 [2024-04-24 19:15:01.506100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17340673859090677657 len:1 00:08:14.492 [2024-04-24 19:15:01.506128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.492 [2024-04-24 19:15:01.506167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:9851624184872960 len:1 00:08:14.492 [2024-04-24 19:15:01.506183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.492 [2024-04-24 19:15:01.506240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 00:08:14.492 [2024-04-24 19:15:01.506256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.751 #44 NEW cov: 11925 ft: 14995 corp: 36/855b lim: 50 exec/s: 44 rss: 74Mb L: 34/47 MS: 1 CrossOver- 00:08:14.751 [2024-04-24 19:15:01.546224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:450682076 len:1 00:08:14.751 [2024-04-24 19:15:01.546251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.751 [2024-04-24 19:15:01.546288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:15914838020685880540 len:56541 00:08:14.751 [2024-04-24 19:15:01.546304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.751 [2024-04-24 19:15:01.546359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:15914595186925952220 len:56541 00:08:14.751 [2024-04-24 19:15:01.546392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.751 #45 NEW cov: 11925 ft: 15034 corp: 37/887b lim: 50 exec/s: 45 rss: 74Mb L: 32/47 MS: 1 InsertRepeatedBytes- 00:08:14.751 [2024-04-24 19:15:01.596153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:15914838021122219228 len:56541 00:08:14.751 [2024-04-24 19:15:01.596180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.751 #46 NEW cov: 11925 ft: 15049 corp: 38/906b lim: 50 exec/s: 23 rss: 74Mb L: 19/47 MS: 1 ChangeBit- 00:08:14.751 #46 DONE cov: 11925 ft: 15049 corp: 38/906b lim: 50 exec/s: 23 rss: 74Mb 00:08:14.751 ###### Recommended dictionary. ###### 00:08:14.751 "\000\000" # Uses: 3 00:08:14.751 "\001\000\177\231\360\026\246s" # Uses: 0 00:08:14.751 "\001\011p\000;9{\022" # Uses: 0 00:08:14.751 ###### End of recommended dictionary. ###### 00:08:14.751 Done 46 runs in 2 second(s) 00:08:14.751 19:15:01 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:14.751 19:15:01 -- ../common.sh@72 -- # (( i++ )) 00:08:14.751 19:15:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.751 19:15:01 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:14.751 19:15:01 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:14.751 19:15:01 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.751 19:15:01 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.751 19:15:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:14.751 19:15:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:14.751 19:15:01 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:14.751 19:15:01 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:14.751 19:15:01 -- nvmf/run.sh@34 -- # printf %02d 20 00:08:14.751 19:15:01 -- nvmf/run.sh@34 -- # port=4420 00:08:14.751 19:15:01 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:15.011 19:15:01 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:15.011 19:15:01 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.011 19:15:01 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.011 19:15:01 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:15.011 19:15:01 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:15.011 [2024-04-24 19:15:01.802514] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 
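Run 19 closes the same way run 18 did: libFuzzer prints a recommended dictionary of byte strings it found productive (entries like "\000\000" could be fed back to later runs via its -dict= option), and run.sh rotates to the next instance, -Z 20 on port 4420. That instance exercises the reservation-acquire path, as the NEW_FUNC line below for fuzz_nvm_reservation_acquire_command shows. A sketch of how the RESERVATION ACQUIRE dword might be packed, assuming the NVMe-spec CDW10 layout (RACQA in bits [2:0], IEKEY in bit [3], RTYPE in bits [15:8]; the 16-byte data payload carries the current and preempt reservation keys); the names here are illustrative, not SPDK's:

    #include <stdint.h>

    struct nvme_cmd_sketch {
        uint8_t  opc;
        uint32_t cdw10;
    };

    /* RACQA 0 = acquire, 1 = preempt, 2 = preempt-and-abort. */
    static void fuzz_resv_acquire_sketch(struct nvme_cmd_sketch *cmd,
                                         uint8_t racqa, uint8_t iekey,
                                         uint8_t rtype)
    {
        cmd->opc   = 0x11;                        /* RESERVATION ACQUIRE (11) */
        cmd->cdw10 = (uint32_t)(racqa & 0x7)
                   | ((uint32_t)(iekey & 0x1) << 3)
                   | ((uint32_t)rtype << 8);
    }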
00:08:15.011 [2024-04-24 19:15:01.802591] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627072 ] 00:08:15.011 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.270 [2024-04-24 19:15:02.113724] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.270 [2024-04-24 19:15:02.201382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.270 [2024-04-24 19:15:02.260748] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.270 [2024-04-24 19:15:02.276966] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:15.528 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.528 INFO: Seed: 128873273 00:08:15.528 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:15.528 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:15.528 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:15.528 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.528 #2 INITED exec/s: 0 rss: 64Mb 00:08:15.528 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.528 This may also happen if the target rejected all inputs we tried so far 00:08:15.528 [2024-04-24 19:15:02.353356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:15.528 [2024-04-24 19:15:02.353410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.787 NEW_FUNC[1/672]: 0x4a3fb0 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:15.787 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.787 #7 NEW cov: 11739 ft: 11740 corp: 2/31b lim: 90 exec/s: 0 rss: 71Mb L: 30/30 MS: 5 CopyPart-CrossOver-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:15.787 [2024-04-24 19:15:02.694796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:15.787 [2024-04-24 19:15:02.694865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.787 #8 NEW cov: 11869 ft: 12072 corp: 3/61b lim: 90 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 ChangeByte- 00:08:15.787 [2024-04-24 19:15:02.755285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:15.787 [2024-04-24 19:15:02.755317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.787 [2024-04-24 19:15:02.755422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:15.787 [2024-04-24 19:15:02.755438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.787 #9 NEW cov: 11875 ft: 13165 corp: 4/107b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 1 CopyPart- 00:08:16.046 [2024-04-24 19:15:02.805179] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.046 [2024-04-24 19:15:02.805209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.046 #10 NEW cov: 11960 ft: 13455 corp: 5/137b lim: 90 exec/s: 0 rss: 72Mb L: 30/46 MS: 1 ChangeBinInt- 00:08:16.046 [2024-04-24 19:15:02.865779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.046 [2024-04-24 19:15:02.865808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.046 [2024-04-24 19:15:02.865870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.046 [2024-04-24 19:15:02.865891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.046 #11 NEW cov: 11960 ft: 13647 corp: 6/183b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 1 ShuffleBytes- 00:08:16.046 [2024-04-24 19:15:02.925928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.046 [2024-04-24 19:15:02.925956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.046 [2024-04-24 19:15:02.926062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.046 [2024-04-24 19:15:02.926094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.046 #12 NEW cov: 11960 ft: 13757 corp: 7/229b lim: 90 exec/s: 0 rss: 72Mb L: 46/46 MS: 1 ChangeBit- 00:08:16.046 [2024-04-24 19:15:02.985878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.046 [2024-04-24 19:15:02.985909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.046 #13 NEW cov: 11960 ft: 13842 corp: 8/259b lim: 90 exec/s: 0 rss: 72Mb L: 30/46 MS: 1 ChangeBinInt- 00:08:16.046 [2024-04-24 19:15:03.036389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.046 [2024-04-24 19:15:03.036420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.046 [2024-04-24 19:15:03.036527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.046 [2024-04-24 19:15:03.036547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.304 #14 NEW cov: 11960 ft: 13912 corp: 9/299b lim: 90 exec/s: 0 rss: 72Mb L: 40/46 MS: 1 EraseBytes- 00:08:16.304 [2024-04-24 19:15:03.096289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.304 [2024-04-24 19:15:03.096319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.304 #15 NEW cov: 11960 ft: 13946 corp: 10/329b lim: 90 exec/s: 0 rss: 72Mb L: 30/46 MS: 1 ChangeBit- 00:08:16.304 [2024-04-24 19:15:03.156531] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.304 [2024-04-24 19:15:03.156561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.304 #16 NEW cov: 11960 ft: 14001 corp: 11/353b lim: 90 exec/s: 0 rss: 72Mb L: 24/46 MS: 1 EraseBytes- 00:08:16.304 [2024-04-24 19:15:03.216748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.304 [2024-04-24 19:15:03.216784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.304 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.304 #17 NEW cov: 11983 ft: 14049 corp: 12/384b lim: 90 exec/s: 0 rss: 73Mb L: 31/46 MS: 1 InsertByte- 00:08:16.304 [2024-04-24 19:15:03.276954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.304 [2024-04-24 19:15:03.276986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.304 #18 NEW cov: 11983 ft: 14089 corp: 13/415b lim: 90 exec/s: 0 rss: 73Mb L: 31/46 MS: 1 ShuffleBytes- 00:08:16.565 [2024-04-24 19:15:03.337370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.565 [2024-04-24 19:15:03.337400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.565 #19 NEW cov: 11983 ft: 14106 corp: 14/445b lim: 90 exec/s: 19 rss: 73Mb L: 30/46 MS: 1 CrossOver- 00:08:16.565 [2024-04-24 19:15:03.388069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.565 [2024-04-24 19:15:03.388097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.565 [2024-04-24 19:15:03.388188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.565 [2024-04-24 19:15:03.388210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.565 #20 NEW cov: 11983 ft: 14123 corp: 15/491b lim: 90 exec/s: 20 rss: 73Mb L: 46/46 MS: 1 ChangeByte- 00:08:16.566 [2024-04-24 19:15:03.439064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.566 [2024-04-24 19:15:03.439094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.439223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.566 [2024-04-24 19:15:03.439243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.439333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.566 [2024-04-24 19:15:03.439368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 
19:15:03.439464] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.566 [2024-04-24 19:15:03.439484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.566 #21 NEW cov: 11983 ft: 14527 corp: 16/565b lim: 90 exec/s: 21 rss: 73Mb L: 74/74 MS: 1 CopyPart- 00:08:16.566 [2024-04-24 19:15:03.499176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.566 [2024-04-24 19:15:03.499207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.499273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.566 [2024-04-24 19:15:03.499293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.499345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.566 [2024-04-24 19:15:03.499365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.499465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.566 [2024-04-24 19:15:03.499499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.566 #22 NEW cov: 11983 ft: 14537 corp: 17/640b lim: 90 exec/s: 22 rss: 73Mb L: 75/75 MS: 1 InsertByte- 00:08:16.566 [2024-04-24 19:15:03.558736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.566 [2024-04-24 19:15:03.558766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.566 [2024-04-24 19:15:03.558864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.566 [2024-04-24 19:15:03.558883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.825 #23 NEW cov: 11983 ft: 14581 corp: 18/679b lim: 90 exec/s: 23 rss: 73Mb L: 39/75 MS: 1 InsertRepeatedBytes- 00:08:16.825 [2024-04-24 19:15:03.619600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.825 [2024-04-24 19:15:03.619629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.825 [2024-04-24 19:15:03.619700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.825 [2024-04-24 19:15:03.619717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.825 [2024-04-24 19:15:03.619774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.825 [2024-04-24 19:15:03.619793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.825 #24 NEW cov: 
11983 ft: 14846 corp: 19/740b lim: 90 exec/s: 24 rss: 73Mb L: 61/75 MS: 1 CrossOver- 00:08:16.826 [2024-04-24 19:15:03.669385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.826 [2024-04-24 19:15:03.669414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.826 [2024-04-24 19:15:03.669479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.826 [2024-04-24 19:15:03.669500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.826 #25 NEW cov: 11983 ft: 14911 corp: 20/786b lim: 90 exec/s: 25 rss: 73Mb L: 46/75 MS: 1 CrossOver- 00:08:16.826 [2024-04-24 19:15:03.719531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.826 [2024-04-24 19:15:03.719561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.826 [2024-04-24 19:15:03.719629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.826 [2024-04-24 19:15:03.719649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.826 #26 NEW cov: 11983 ft: 14927 corp: 21/832b lim: 90 exec/s: 26 rss: 73Mb L: 46/75 MS: 1 ChangeBinInt- 00:08:16.826 [2024-04-24 19:15:03.770227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.826 [2024-04-24 19:15:03.770257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.826 [2024-04-24 19:15:03.770335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.826 [2024-04-24 19:15:03.770355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.826 [2024-04-24 19:15:03.770423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.826 [2024-04-24 19:15:03.770444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.826 #27 NEW cov: 11983 ft: 14950 corp: 22/891b lim: 90 exec/s: 27 rss: 73Mb L: 59/75 MS: 1 InsertRepeatedBytes- 00:08:16.826 [2024-04-24 19:15:03.819685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.826 [2024-04-24 19:15:03.819717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.826 #28 NEW cov: 11983 ft: 14956 corp: 23/914b lim: 90 exec/s: 28 rss: 73Mb L: 23/75 MS: 1 CrossOver- 00:08:17.084 [2024-04-24 19:15:03.870505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.084 [2024-04-24 19:15:03.870535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.084 [2024-04-24 19:15:03.870609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) 
sqid:1 cid:1 nsid:0 00:08:17.084 [2024-04-24 19:15:03.870629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.084 [2024-04-24 19:15:03.870704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.084 [2024-04-24 19:15:03.870722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.084 #29 NEW cov: 11983 ft: 14960 corp: 24/972b lim: 90 exec/s: 29 rss: 73Mb L: 58/75 MS: 1 CrossOver- 00:08:17.084 [2024-04-24 19:15:03.930033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.084 [2024-04-24 19:15:03.930066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.084 #30 NEW cov: 11983 ft: 14975 corp: 25/1002b lim: 90 exec/s: 30 rss: 73Mb L: 30/75 MS: 1 ChangeBit- 00:08:17.084 [2024-04-24 19:15:03.990817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.084 [2024-04-24 19:15:03.990845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.084 [2024-04-24 19:15:03.990926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.084 [2024-04-24 19:15:03.990946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.084 #31 NEW cov: 11983 ft: 15031 corp: 26/1048b lim: 90 exec/s: 31 rss: 73Mb L: 46/75 MS: 1 CMP- DE: "\001\004"- 00:08:17.084 [2024-04-24 19:15:04.051325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.084 [2024-04-24 19:15:04.051353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.084 [2024-04-24 19:15:04.051424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.084 [2024-04-24 19:15:04.051442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.084 [2024-04-24 19:15:04.051526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.084 [2024-04-24 19:15:04.051544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.084 #32 NEW cov: 11983 ft: 15097 corp: 27/1109b lim: 90 exec/s: 32 rss: 73Mb L: 61/75 MS: 1 CopyPart- 00:08:17.342 [2024-04-24 19:15:04.101005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.342 [2024-04-24 19:15:04.101034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.342 #33 NEW cov: 11983 ft: 15162 corp: 28/1139b lim: 90 exec/s: 33 rss: 73Mb L: 30/75 MS: 1 ShuffleBytes- 00:08:17.342 [2024-04-24 19:15:04.161533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.342 [2024-04-24 19:15:04.161565] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.342 [2024-04-24 19:15:04.161665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.342 [2024-04-24 19:15:04.161686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.342 #34 NEW cov: 11983 ft: 15182 corp: 29/1187b lim: 90 exec/s: 34 rss: 73Mb L: 48/75 MS: 1 PersAutoDict- DE: "\001\004"- 00:08:17.342 [2024-04-24 19:15:04.211947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.342 [2024-04-24 19:15:04.211976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.342 [2024-04-24 19:15:04.212046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.342 [2024-04-24 19:15:04.212074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.342 [2024-04-24 19:15:04.212134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.342 [2024-04-24 19:15:04.212153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.342 #35 NEW cov: 11983 ft: 15194 corp: 30/1249b lim: 90 exec/s: 35 rss: 73Mb L: 62/75 MS: 1 InsertByte- 00:08:17.342 [2024-04-24 19:15:04.271550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.342 [2024-04-24 19:15:04.271578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.342 #36 NEW cov: 11983 ft: 15202 corp: 31/1279b lim: 90 exec/s: 36 rss: 74Mb L: 30/75 MS: 1 ChangeByte- 00:08:17.342 [2024-04-24 19:15:04.322446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.342 [2024-04-24 19:15:04.322476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.342 [2024-04-24 19:15:04.322549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.342 [2024-04-24 19:15:04.322567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.342 [2024-04-24 19:15:04.322629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.342 [2024-04-24 19:15:04.322646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.342 #37 NEW cov: 11983 ft: 15270 corp: 32/1342b lim: 90 exec/s: 18 rss: 74Mb L: 63/75 MS: 1 CrossOver- 00:08:17.343 #37 DONE cov: 11983 ft: 15270 corp: 32/1342b lim: 90 exec/s: 18 rss: 74Mb 00:08:17.343 ###### Recommended dictionary. ###### 00:08:17.343 "\001\004" # Uses: 1 00:08:17.343 ###### End of recommended dictionary. 
###### 00:08:17.343 Done 37 runs in 2 second(s) 00:08:17.600 19:15:04 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:17.600 19:15:04 -- ../common.sh@72 -- # (( i++ )) 00:08:17.600 19:15:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.600 19:15:04 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:17.600 19:15:04 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:17.600 19:15:04 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.600 19:15:04 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.600 19:15:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:17.600 19:15:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:17.600 19:15:04 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:17.600 19:15:04 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:17.600 19:15:04 -- nvmf/run.sh@34 -- # printf %02d 21 00:08:17.600 19:15:04 -- nvmf/run.sh@34 -- # port=4421 00:08:17.600 19:15:04 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:17.600 19:15:04 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:17.600 19:15:04 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.600 19:15:04 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:17.600 19:15:04 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:17.600 19:15:04 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:17.600 [2024-04-24 19:15:04.532880] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:17.600 [2024-04-24 19:15:04.532954] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627629 ] 00:08:17.600 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.857 [2024-04-24 19:15:04.807914] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.115 [2024-04-24 19:15:04.888003] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.115 [2024-04-24 19:15:04.947498] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.115 [2024-04-24 19:15:04.963692] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:18.115 INFO: Running with entropic power schedule (0xFF, 100). 
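The xtrace above shows how start_llvm_fuzz drives one fuzzer round: it derives a per-type TCP port, rewrites the JSON target config, registers LeakSanitizer suppressions, and launches llvm_nvme_fuzz against a freshly created corpus directory. Below is a minimal sketch of that flow reconstructed from the trace; the variable names (fuzzer_type, timen, core, corpus_dir, nvmf_cfg, trid, suppress_file) appear in the log, but the redirections and surrounding wiring are assumptions, not the verbatim run.sh.

    # Sketch reconstructed from the xtrace above -- illustrative, not the
    # real nvmf/run.sh. Names come from the trace; redirections are assumed.
    start_llvm_fuzz() {
        local fuzzer_type=$1          # e.g. 21
        local timen=$2                # runtime in minutes, passed to -t
        local core=$3                 # core mask, passed to -m
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_${fuzzer_type}
        local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz

        # Port scheme seen in the trace: "44" + zero-padded fuzzer type.
        local port=44$(printf %02d $fuzzer_type)   # 21 -> 4421, 22 -> 4422
        mkdir -p $corpus_dir

        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Point the target config at this run's port (redirection assumed).
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg

        # Suppress two known, intentional leaks for LeakSanitizer.
        echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
        echo leak:nvmf_ctrlr_create >> $suppress_file

        LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
            $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz \
            -m $core -s 512 -P $output_dir/llvm/ -F "$trid" \
            -c $nvmf_cfg -t $timen -D $corpus_dir -Z $fuzzer_type
    }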
00:08:18.115 INFO: Seed: 2815868938 00:08:18.115 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:18.115 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:18.115 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:18.115 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.115 #2 INITED exec/s: 0 rss: 64Mb 00:08:18.115 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.115 This may also happen if the target rejected all inputs we tried so far 00:08:18.115 [2024-04-24 19:15:05.030150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.115 [2024-04-24 19:15:05.030190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.115 [2024-04-24 19:15:05.030316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.115 [2024-04-24 19:15:05.030342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.373 NEW_FUNC[1/672]: 0x4a71d0 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:18.373 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.373 #3 NEW cov: 11714 ft: 11715 corp: 2/25b lim: 50 exec/s: 0 rss: 71Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:18.373 [2024-04-24 19:15:05.361176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.373 [2024-04-24 19:15:05.361219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 #4 NEW cov: 11844 ft: 13041 corp: 3/43b lim: 50 exec/s: 0 rss: 71Mb L: 18/24 MS: 1 EraseBytes- 00:08:18.631 [2024-04-24 19:15:05.421524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.631 [2024-04-24 19:15:05.421555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 [2024-04-24 19:15:05.421648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.631 [2024-04-24 19:15:05.421671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.631 #6 NEW cov: 11850 ft: 13319 corp: 4/68b lim: 50 exec/s: 0 rss: 71Mb L: 25/25 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:18.631 [2024-04-24 19:15:05.471703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.631 [2024-04-24 19:15:05.471732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 [2024-04-24 19:15:05.471832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.631 [2024-04-24 19:15:05.471851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:18.631 #7 NEW cov: 11935 ft: 13583 corp: 5/92b lim: 50 exec/s: 0 rss: 71Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:18.631 [2024-04-24 19:15:05.521921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.631 [2024-04-24 19:15:05.521951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 [2024-04-24 19:15:05.522047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.631 [2024-04-24 19:15:05.522067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.631 #8 NEW cov: 11935 ft: 13694 corp: 6/121b lim: 50 exec/s: 0 rss: 72Mb L: 29/29 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:18.631 [2024-04-24 19:15:05.582094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.631 [2024-04-24 19:15:05.582120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 [2024-04-24 19:15:05.582213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.631 [2024-04-24 19:15:05.582230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.631 #9 NEW cov: 11935 ft: 13743 corp: 7/146b lim: 50 exec/s: 0 rss: 72Mb L: 25/29 MS: 1 ChangeBit- 00:08:18.631 [2024-04-24 19:15:05.632321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.631 [2024-04-24 19:15:05.632350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.631 [2024-04-24 19:15:05.632459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.631 [2024-04-24 19:15:05.632478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.889 #10 NEW cov: 11935 ft: 13889 corp: 8/172b lim: 50 exec/s: 0 rss: 72Mb L: 26/29 MS: 1 CrossOver- 00:08:18.889 [2024-04-24 19:15:05.692426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.889 [2024-04-24 19:15:05.692457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.692556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.889 [2024-04-24 19:15:05.692572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.889 #11 NEW cov: 11935 ft: 13927 corp: 9/196b lim: 50 exec/s: 0 rss: 72Mb L: 24/29 MS: 1 CopyPart- 00:08:18.889 [2024-04-24 19:15:05.743294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.889 [2024-04-24 19:15:05.743321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.743403] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.889 [2024-04-24 19:15:05.743420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.743494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.889 [2024-04-24 19:15:05.743509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.743596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.889 [2024-04-24 19:15:05.743616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.889 #12 NEW cov: 11935 ft: 14330 corp: 10/244b lim: 50 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 CrossOver- 00:08:18.889 [2024-04-24 19:15:05.802840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.889 [2024-04-24 19:15:05.802871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.802956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.889 [2024-04-24 19:15:05.802975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.889 #13 NEW cov: 11935 ft: 14384 corp: 11/270b lim: 50 exec/s: 0 rss: 72Mb L: 26/48 MS: 1 InsertByte- 00:08:18.889 [2024-04-24 19:15:05.853066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.889 [2024-04-24 19:15:05.853097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.889 [2024-04-24 19:15:05.853202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.889 [2024-04-24 19:15:05.853224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.889 #14 NEW cov: 11935 ft: 14410 corp: 12/294b lim: 50 exec/s: 0 rss: 72Mb L: 24/48 MS: 1 ChangeByte- 00:08:19.147 [2024-04-24 19:15:05.913032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.147 [2024-04-24 19:15:05.913068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.147 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.147 #15 NEW cov: 11958 ft: 14493 corp: 13/309b lim: 50 exec/s: 0 rss: 72Mb L: 15/48 MS: 1 EraseBytes- 00:08:19.147 [2024-04-24 19:15:05.973551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.147 [2024-04-24 19:15:05.973582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.147 [2024-04-24 19:15:05.973660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.147 [2024-04-24 
19:15:05.973681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.147 #16 NEW cov: 11958 ft: 14526 corp: 14/333b lim: 50 exec/s: 0 rss: 72Mb L: 24/48 MS: 1 ChangeBinInt- 00:08:19.147 [2024-04-24 19:15:06.033837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.147 [2024-04-24 19:15:06.033866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.147 [2024-04-24 19:15:06.033963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.147 [2024-04-24 19:15:06.033985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.147 #17 NEW cov: 11958 ft: 14560 corp: 15/357b lim: 50 exec/s: 17 rss: 72Mb L: 24/48 MS: 1 CopyPart- 00:08:19.147 [2024-04-24 19:15:06.093925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.147 [2024-04-24 19:15:06.093952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.147 [2024-04-24 19:15:06.094024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.147 [2024-04-24 19:15:06.094041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.147 #18 NEW cov: 11958 ft: 14582 corp: 16/381b lim: 50 exec/s: 18 rss: 72Mb L: 24/48 MS: 1 ChangeByte- 00:08:19.147 [2024-04-24 19:15:06.144518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.147 [2024-04-24 19:15:06.144550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.147 [2024-04-24 19:15:06.144622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.147 [2024-04-24 19:15:06.144640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.147 [2024-04-24 19:15:06.144698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.147 [2024-04-24 19:15:06.144717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.405 #19 NEW cov: 11958 ft: 14844 corp: 17/411b lim: 50 exec/s: 19 rss: 72Mb L: 30/48 MS: 1 InsertRepeatedBytes- 00:08:19.405 [2024-04-24 19:15:06.194514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.405 [2024-04-24 19:15:06.194542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.405 [2024-04-24 19:15:06.194639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.405 [2024-04-24 19:15:06.194656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.405 #20 NEW cov: 11958 ft: 14877 corp: 18/436b 
lim: 50 exec/s: 20 rss: 72Mb L: 25/48 MS: 1 ChangeBit- 00:08:19.405 [2024-04-24 19:15:06.244624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.406 [2024-04-24 19:15:06.244652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.406 [2024-04-24 19:15:06.244707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.406 [2024-04-24 19:15:06.244724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.406 #26 NEW cov: 11958 ft: 14898 corp: 19/460b lim: 50 exec/s: 26 rss: 73Mb L: 24/48 MS: 1 CopyPart- 00:08:19.406 [2024-04-24 19:15:06.294583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.406 [2024-04-24 19:15:06.294611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.406 #27 NEW cov: 11958 ft: 14909 corp: 20/473b lim: 50 exec/s: 27 rss: 73Mb L: 13/48 MS: 1 EraseBytes- 00:08:19.406 [2024-04-24 19:15:06.345410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.406 [2024-04-24 19:15:06.345437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.406 [2024-04-24 19:15:06.345503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.406 [2024-04-24 19:15:06.345521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.406 [2024-04-24 19:15:06.345586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.406 [2024-04-24 19:15:06.345601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.406 #28 NEW cov: 11958 ft: 14965 corp: 21/503b lim: 50 exec/s: 28 rss: 73Mb L: 30/48 MS: 1 InsertRepeatedBytes- 00:08:19.406 [2024-04-24 19:15:06.405480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.406 [2024-04-24 19:15:06.405509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.406 [2024-04-24 19:15:06.405613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.406 [2024-04-24 19:15:06.405628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.664 #29 NEW cov: 11958 ft: 15000 corp: 22/529b lim: 50 exec/s: 29 rss: 73Mb L: 26/48 MS: 1 ChangeBinInt- 00:08:19.664 [2024-04-24 19:15:06.455753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.664 [2024-04-24 19:15:06.455781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.664 [2024-04-24 19:15:06.455853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 
00:08:19.664 [2024-04-24 19:15:06.455870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.664 #30 NEW cov: 11958 ft: 15012 corp: 23/555b lim: 50 exec/s: 30 rss: 73Mb L: 26/48 MS: 1 ChangeBinInt- 00:08:19.664 [2024-04-24 19:15:06.506815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.664 [2024-04-24 19:15:06.506842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.664 [2024-04-24 19:15:06.506911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.664 [2024-04-24 19:15:06.506931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.664 [2024-04-24 19:15:06.507000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.664 [2024-04-24 19:15:06.507017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.664 [2024-04-24 19:15:06.507112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.665 [2024-04-24 19:15:06.507131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.665 [2024-04-24 19:15:06.507224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:19.665 [2024-04-24 19:15:06.507242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.665 #31 NEW cov: 11958 ft: 15068 corp: 24/605b lim: 50 exec/s: 31 rss: 73Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:19.665 [2024-04-24 19:15:06.565917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.665 [2024-04-24 19:15:06.565947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.665 #36 NEW cov: 11958 ft: 15127 corp: 25/617b lim: 50 exec/s: 36 rss: 73Mb L: 12/50 MS: 5 ChangeByte-InsertByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:19.665 [2024-04-24 19:15:06.616270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.665 [2024-04-24 19:15:06.616301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.665 [2024-04-24 19:15:06.616370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.665 [2024-04-24 19:15:06.616387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.665 #37 NEW cov: 11958 ft: 15138 corp: 26/643b lim: 50 exec/s: 37 rss: 73Mb L: 26/50 MS: 1 ShuffleBytes- 00:08:19.665 [2024-04-24 19:15:06.676930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.665 [2024-04-24 19:15:06.676956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.665 [2024-04-24 19:15:06.677027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.665 [2024-04-24 19:15:06.677045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.665 [2024-04-24 19:15:06.677129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.665 [2024-04-24 19:15:06.677148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.923 #39 NEW cov: 11958 ft: 15182 corp: 27/681b lim: 50 exec/s: 39 rss: 73Mb L: 38/50 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:19.923 [2024-04-24 19:15:06.727145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.923 [2024-04-24 19:15:06.727173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.727237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.923 [2024-04-24 19:15:06.727254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.727337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.923 [2024-04-24 19:15:06.727354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.923 #41 NEW cov: 11958 ft: 15210 corp: 28/714b lim: 50 exec/s: 41 rss: 73Mb L: 33/50 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:19.923 [2024-04-24 19:15:06.777106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.923 [2024-04-24 19:15:06.777134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.777225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.923 [2024-04-24 19:15:06.777244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.923 #42 NEW cov: 11958 ft: 15226 corp: 29/739b lim: 50 exec/s: 42 rss: 73Mb L: 25/50 MS: 1 ChangeBinInt- 00:08:19.923 [2024-04-24 19:15:06.827830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.923 [2024-04-24 19:15:06.827859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.827920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.923 [2024-04-24 19:15:06.827938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.828006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.923 [2024-04-24 19:15:06.828024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.828118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.923 [2024-04-24 19:15:06.828135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.923 #43 NEW cov: 11958 ft: 15233 corp: 30/787b lim: 50 exec/s: 43 rss: 73Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:08:19.923 [2024-04-24 19:15:06.877476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.923 [2024-04-24 19:15:06.877504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.877571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.923 [2024-04-24 19:15:06.877590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.923 #44 NEW cov: 11958 ft: 15253 corp: 31/813b lim: 50 exec/s: 44 rss: 73Mb L: 26/50 MS: 1 InsertByte- 00:08:19.923 [2024-04-24 19:15:06.927712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.923 [2024-04-24 19:15:06.927740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.923 [2024-04-24 19:15:06.927798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.923 [2024-04-24 19:15:06.927816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.182 #45 NEW cov: 11958 ft: 15272 corp: 32/837b lim: 50 exec/s: 45 rss: 73Mb L: 24/50 MS: 1 ChangeBit- 00:08:20.182 [2024-04-24 19:15:06.978885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.182 [2024-04-24 19:15:06.978915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.182 [2024-04-24 19:15:06.978985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.182 [2024-04-24 19:15:06.979002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.182 [2024-04-24 19:15:06.979102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.182 [2024-04-24 19:15:06.979120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.182 [2024-04-24 19:15:06.979218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:20.182 [2024-04-24 19:15:06.979236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.182 [2024-04-24 19:15:06.979330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:20.182 [2024-04-24 19:15:06.979346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:20.182 #46 NEW cov: 11958 ft: 15274 corp: 33/887b lim: 50 exec/s: 23 rss: 73Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:20.182 #46 DONE cov: 11958 ft: 15274 corp: 33/887b lim: 50 exec/s: 23 rss: 73Mb 00:08:20.182 ###### Recommended dictionary. ###### 00:08:20.182 "\000\000\000\000" # Uses: 1 00:08:20.182 ###### End of recommended dictionary. ###### 00:08:20.182 Done 46 runs in 2 second(s) 00:08:20.182 19:15:07 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.182 19:15:07 -- ../common.sh@72 -- # (( i++ )) 00:08:20.182 19:15:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.182 19:15:07 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:20.182 19:15:07 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:20.182 19:15:07 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.182 19:15:07 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.182 19:15:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.182 19:15:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:20.182 19:15:07 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.182 19:15:07 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.182 19:15:07 -- nvmf/run.sh@34 -- # printf %02d 22 00:08:20.182 19:15:07 -- nvmf/run.sh@34 -- # port=4422 00:08:20.182 19:15:07 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.182 19:15:07 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:20.182 19:15:07 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.182 19:15:07 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.182 19:15:07 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.183 19:15:07 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:20.183 [2024-04-24 19:15:07.187125] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:20.183 [2024-04-24 19:15:07.187197] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628251 ] 00:08:20.441 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.700 [2024-04-24 19:15:07.466777] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.700 [2024-04-24 19:15:07.552090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.700 [2024-04-24 19:15:07.611478] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.700 [2024-04-24 19:15:07.627673] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:20.700 INFO: Running with entropic power schedule (0xFF, 100). 
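As seen above, each round closes with a "#N DONE cov:" coverage summary, the recommended dictionary, and a "Done N runs in M second(s)" line before the next fuzzer type is set up. A hypothetical one-liner for post-processing a saved copy of this console log (build.log is an assumed filename) pulls those per-run summaries out:

    # Hypothetical post-processing of a saved console log: list each
    # round's final coverage summary and wall-clock runtime.
    grep -E 'DONE cov:|Done [0-9]+ runs' build.log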
00:08:20.700 INFO: Seed: 1185915325 00:08:20.700 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:20.700 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:20.700 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.700 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.700 #2 INITED exec/s: 0 rss: 64Mb 00:08:20.700 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:20.700 This may also happen if the target rejected all inputs we tried so far 00:08:20.700 [2024-04-24 19:15:07.672544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:20.700 [2024-04-24 19:15:07.672580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.700 [2024-04-24 19:15:07.672617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:20.700 [2024-04-24 19:15:07.672637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.700 [2024-04-24 19:15:07.672668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:20.700 [2024-04-24 19:15:07.672685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.217 NEW_FUNC[1/672]: 0x4a9490 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:21.217 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.217 #16 NEW cov: 11740 ft: 11741 corp: 2/53b lim: 85 exec/s: 0 rss: 70Mb L: 52/52 MS: 4 InsertByte-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:21.217 [2024-04-24 19:15:08.013324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.217 [2024-04-24 19:15:08.013375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.013413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.217 [2024-04-24 19:15:08.013429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.013458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.217 [2024-04-24 19:15:08.013474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.217 #17 NEW cov: 11870 ft: 12186 corp: 3/105b lim: 85 exec/s: 0 rss: 71Mb L: 52/52 MS: 1 ChangeBinInt- 00:08:21.217 [2024-04-24 19:15:08.083404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.217 [2024-04-24 19:15:08.083436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.083470] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.217 [2024-04-24 19:15:08.083488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.083520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.217 [2024-04-24 19:15:08.083537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.217 #18 NEW cov: 11876 ft: 12484 corp: 4/157b lim: 85 exec/s: 0 rss: 71Mb L: 52/52 MS: 1 ChangeByte- 00:08:21.217 [2024-04-24 19:15:08.133515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.217 [2024-04-24 19:15:08.133546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.133580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.217 [2024-04-24 19:15:08.133598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.133628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.217 [2024-04-24 19:15:08.133645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.217 #19 NEW cov: 11961 ft: 12699 corp: 5/209b lim: 85 exec/s: 0 rss: 71Mb L: 52/52 MS: 1 ChangeBinInt- 00:08:21.217 [2024-04-24 19:15:08.183652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.217 [2024-04-24 19:15:08.183681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.183729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.217 [2024-04-24 19:15:08.183748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.217 [2024-04-24 19:15:08.183778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.217 [2024-04-24 19:15:08.183795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.217 #20 NEW cov: 11961 ft: 12790 corp: 6/262b lim: 85 exec/s: 0 rss: 72Mb L: 53/53 MS: 1 InsertByte- 00:08:21.477 [2024-04-24 19:15:08.253824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.477 [2024-04-24 19:15:08.253853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.253902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.477 [2024-04-24 19:15:08.253920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.253950] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.477 [2024-04-24 19:15:08.253967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.477 #21 NEW cov: 11961 ft: 12850 corp: 7/314b lim: 85 exec/s: 0 rss: 72Mb L: 52/53 MS: 1 CrossOver- 00:08:21.477 [2024-04-24 19:15:08.303932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.477 [2024-04-24 19:15:08.303963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.303997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.477 [2024-04-24 19:15:08.304015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.304046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.477 [2024-04-24 19:15:08.304068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.477 #27 NEW cov: 11961 ft: 12904 corp: 8/366b lim: 85 exec/s: 0 rss: 72Mb L: 52/53 MS: 1 ChangeBinInt- 00:08:21.477 [2024-04-24 19:15:08.374172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.477 [2024-04-24 19:15:08.374207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.374258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.477 [2024-04-24 19:15:08.374278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.374309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.477 [2024-04-24 19:15:08.374327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.477 #28 NEW cov: 11961 ft: 12954 corp: 9/418b lim: 85 exec/s: 0 rss: 72Mb L: 52/53 MS: 1 ChangeBinInt- 00:08:21.477 [2024-04-24 19:15:08.424266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.477 [2024-04-24 19:15:08.424298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.424348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.477 [2024-04-24 19:15:08.424367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.477 [2024-04-24 19:15:08.424398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.477 [2024-04-24 19:15:08.424415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.477 #29 NEW cov: 11961 ft: 12990 corp: 10/470b 
lim: 85 exec/s: 0 rss: 72Mb L: 52/53 MS: 1 CopyPart- 00:08:21.736 [2024-04-24 19:15:08.494603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.736 [2024-04-24 19:15:08.494638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.494673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.736 [2024-04-24 19:15:08.494691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.494722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.736 [2024-04-24 19:15:08.494740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.494769] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:21.736 [2024-04-24 19:15:08.494787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.736 #35 NEW cov: 11961 ft: 13430 corp: 11/540b lim: 85 exec/s: 0 rss: 72Mb L: 70/70 MS: 1 CopyPart- 00:08:21.736 [2024-04-24 19:15:08.554578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.736 [2024-04-24 19:15:08.554609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.554658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.736 [2024-04-24 19:15:08.554677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.554708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.736 [2024-04-24 19:15:08.554724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.736 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.736 #36 NEW cov: 11978 ft: 13473 corp: 12/592b lim: 85 exec/s: 0 rss: 72Mb L: 52/70 MS: 1 ChangeASCIIInt- 00:08:21.736 [2024-04-24 19:15:08.624785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.736 [2024-04-24 19:15:08.624815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.624865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.736 [2024-04-24 19:15:08.624883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.624913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.736 [2024-04-24 19:15:08.624931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.736 #37 NEW cov: 11978 ft: 13492 corp: 13/644b lim: 85 exec/s: 0 rss: 72Mb L: 52/70 MS: 1 CMP- DE: "\001\000\000\014"- 00:08:21.736 [2024-04-24 19:15:08.674917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.736 [2024-04-24 19:15:08.674946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.674995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.736 [2024-04-24 19:15:08.675013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.675044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.736 [2024-04-24 19:15:08.675068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.736 #38 NEW cov: 11978 ft: 13551 corp: 14/703b lim: 85 exec/s: 38 rss: 72Mb L: 59/70 MS: 1 CrossOver- 00:08:21.736 [2024-04-24 19:15:08.745126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.736 [2024-04-24 19:15:08.745157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.745191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.736 [2024-04-24 19:15:08.745209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.736 [2024-04-24 19:15:08.745239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.736 [2024-04-24 19:15:08.745256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.995 #39 NEW cov: 11978 ft: 13588 corp: 15/755b lim: 85 exec/s: 39 rss: 73Mb L: 52/70 MS: 1 CopyPart- 00:08:21.995 [2024-04-24 19:15:08.795280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.995 [2024-04-24 19:15:08.795310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:08.795359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.995 [2024-04-24 19:15:08.795377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:08.795407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.995 [2024-04-24 19:15:08.795424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.995 #40 NEW cov: 11978 ft: 13606 corp: 16/807b lim: 85 exec/s: 40 rss: 73Mb L: 52/70 MS: 1 ChangeBinInt- 00:08:21.995 [2024-04-24 19:15:08.865444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:0 nsid:0 00:08:21.995 [2024-04-24 19:15:08.865474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:08.865508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.995 [2024-04-24 19:15:08.865526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:08.865556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.995 [2024-04-24 19:15:08.865572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.995 #41 NEW cov: 11978 ft: 13619 corp: 17/859b lim: 85 exec/s: 41 rss: 73Mb L: 52/70 MS: 1 ChangeBinInt- 00:08:21.995 [2024-04-24 19:15:08.935598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.995 [2024-04-24 19:15:08.935627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:08.935662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.995 [2024-04-24 19:15:08.935680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.995 #42 NEW cov: 11978 ft: 13959 corp: 18/894b lim: 85 exec/s: 42 rss: 73Mb L: 35/70 MS: 1 EraseBytes- 00:08:21.995 [2024-04-24 19:15:09.005804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.995 [2024-04-24 19:15:09.005834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:09.005887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.995 [2024-04-24 19:15:09.005905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.995 [2024-04-24 19:15:09.005935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.995 [2024-04-24 19:15:09.005952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.254 #43 NEW cov: 11978 ft: 13993 corp: 19/946b lim: 85 exec/s: 43 rss: 73Mb L: 52/70 MS: 1 ChangeBinInt- 00:08:22.254 [2024-04-24 19:15:09.075997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.254 [2024-04-24 19:15:09.076028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.076069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.254 [2024-04-24 19:15:09.076088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.076119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.254 [2024-04-24 19:15:09.076136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.254 #44 NEW cov: 11978 ft: 14004 corp: 20/998b lim: 85 exec/s: 44 rss: 73Mb L: 52/70 MS: 1 PersAutoDict- DE: "\001\000\000\014"- 00:08:22.254 [2024-04-24 19:15:09.146182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.254 [2024-04-24 19:15:09.146212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.146246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.254 [2024-04-24 19:15:09.146264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.146295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.254 [2024-04-24 19:15:09.146311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.254 #45 NEW cov: 11978 ft: 14012 corp: 21/1051b lim: 85 exec/s: 45 rss: 73Mb L: 53/70 MS: 1 ChangeBinInt- 00:08:22.254 [2024-04-24 19:15:09.216378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.254 [2024-04-24 19:15:09.216407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.216441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.254 [2024-04-24 19:15:09.216459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.216489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.254 [2024-04-24 19:15:09.216506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.254 #46 NEW cov: 11978 ft: 14054 corp: 22/1103b lim: 85 exec/s: 46 rss: 73Mb L: 52/70 MS: 1 CMP- DE: "\005\000"- 00:08:22.254 [2024-04-24 19:15:09.266494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.254 [2024-04-24 19:15:09.266525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.266559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.254 [2024-04-24 19:15:09.266582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.254 [2024-04-24 19:15:09.266613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.254 [2024-04-24 19:15:09.266631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.514 #47 NEW cov: 11978 ft: 14064 corp: 23/1156b lim: 85 
exec/s: 47 rss: 73Mb L: 53/70 MS: 1 InsertByte- 00:08:22.514 [2024-04-24 19:15:09.316605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.514 [2024-04-24 19:15:09.316635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.316669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.514 [2024-04-24 19:15:09.316686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.316717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.514 [2024-04-24 19:15:09.316734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.514 #48 NEW cov: 11978 ft: 14080 corp: 24/1208b lim: 85 exec/s: 48 rss: 73Mb L: 52/70 MS: 1 ChangeBit- 00:08:22.514 [2024-04-24 19:15:09.366779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.514 [2024-04-24 19:15:09.366809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.366842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.514 [2024-04-24 19:15:09.366860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.366890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.514 [2024-04-24 19:15:09.366906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.366935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.514 [2024-04-24 19:15:09.366951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.514 #49 NEW cov: 11978 ft: 14093 corp: 25/1289b lim: 85 exec/s: 49 rss: 73Mb L: 81/81 MS: 1 CopyPart- 00:08:22.514 [2024-04-24 19:15:09.417638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.514 [2024-04-24 19:15:09.417666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.417723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.514 [2024-04-24 19:15:09.417740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.417817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.514 [2024-04-24 19:15:09.417834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.514 #50 NEW cov: 11978 ft: 14378 corp: 26/1341b lim: 
85 exec/s: 50 rss: 73Mb L: 52/81 MS: 1 ShuffleBytes- 00:08:22.514 [2024-04-24 19:15:09.457766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.514 [2024-04-24 19:15:09.457793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.457856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.514 [2024-04-24 19:15:09.457871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.457930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.514 [2024-04-24 19:15:09.457946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.514 #51 NEW cov: 11978 ft: 14412 corp: 27/1393b lim: 85 exec/s: 51 rss: 73Mb L: 52/81 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:22.514 [2024-04-24 19:15:09.497714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.514 [2024-04-24 19:15:09.497740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.514 [2024-04-24 19:15:09.497794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.514 [2024-04-24 19:15:09.497810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.514 #52 NEW cov: 11978 ft: 14477 corp: 28/1433b lim: 85 exec/s: 52 rss: 73Mb L: 40/81 MS: 1 EraseBytes- 00:08:22.773 [2024-04-24 19:15:09.538591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.773 [2024-04-24 19:15:09.538669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.773 [2024-04-24 19:15:09.538792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.773 [2024-04-24 19:15:09.538837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.773 [2024-04-24 19:15:09.538949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.773 [2024-04-24 19:15:09.538993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.773 [2024-04-24 19:15:09.539121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.773 [2024-04-24 19:15:09.539165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.774 #53 NEW cov: 11985 ft: 14637 corp: 29/1514b lim: 85 exec/s: 53 rss: 73Mb L: 81/81 MS: 1 ChangeByte- 00:08:22.774 [2024-04-24 19:15:09.598128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.774 [2024-04-24 19:15:09.598153] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.774 [2024-04-24 19:15:09.598216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.774 [2024-04-24 19:15:09.598232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.774 [2024-04-24 19:15:09.598288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.774 [2024-04-24 19:15:09.598303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.774 #59 NEW cov: 11985 ft: 14651 corp: 30/1566b lim: 85 exec/s: 59 rss: 73Mb L: 52/81 MS: 1 ShuffleBytes- 00:08:22.774 [2024-04-24 19:15:09.638231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.774 [2024-04-24 19:15:09.638258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.774 [2024-04-24 19:15:09.638316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.774 [2024-04-24 19:15:09.638333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.774 [2024-04-24 19:15:09.638393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.774 [2024-04-24 19:15:09.638410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.774 #60 NEW cov: 11985 ft: 14676 corp: 31/1618b lim: 85 exec/s: 30 rss: 73Mb L: 52/81 MS: 1 CrossOver- 00:08:22.774 #60 DONE cov: 11985 ft: 14676 corp: 31/1618b lim: 85 exec/s: 30 rss: 73Mb 00:08:22.774 ###### Recommended dictionary. ###### 00:08:22.774 "\001\000\000\014" # Uses: 1 00:08:22.774 "\005\000" # Uses: 0 00:08:22.774 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:22.774 ###### End of recommended dictionary. 
###### 00:08:22.774 Done 60 runs in 2 second(s) 00:08:23.032 19:15:09 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.032 19:15:09 -- ../common.sh@72 -- # (( i++ )) 00:08:23.032 19:15:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.032 19:15:09 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:23.032 19:15:09 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:23.032 19:15:09 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.032 19:15:09 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.032 19:15:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.032 19:15:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:23.032 19:15:09 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.032 19:15:09 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.032 19:15:09 -- nvmf/run.sh@34 -- # printf %02d 23 00:08:23.032 19:15:09 -- nvmf/run.sh@34 -- # port=4423 00:08:23.032 19:15:09 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.032 19:15:09 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:23.032 19:15:09 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.032 19:15:09 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.032 19:15:09 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.032 19:15:09 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:23.032 [2024-04-24 19:15:09.850787] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:23.032 [2024-04-24 19:15:09.850864] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628602 ] 00:08:23.032 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.291 [2024-04-24 19:15:10.133015] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.291 [2024-04-24 19:15:10.217144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.291 [2024-04-24 19:15:10.276855] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.291 [2024-04-24 19:15:10.293057] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:23.291 INFO: Running with entropic power schedule (0xFF, 100). 
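The trace above shows nvmf/run.sh tearing down fuzz target 22 and preparing target 23: each target gets its own JSON config (the template's default port 4420 rewritten to this run's port), its own corpus directory, and a LeakSanitizer suppression file for two SPDK allocations that intentionally outlive a short pass. Condensed into one place, the setup amounts to the sketch below. It is reconstructed from the set -x trace only, so redirections, quoting, and anything the trace does not echo are assumptions, not the literal run.sh source; rootdir stands in for the job's checkout path.

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # as checked out in this job

    start_llvm_fuzz() {                    # invoked above as: start_llvm_fuzz 23 1 0x1
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
        local suppress_file=/var/tmp/suppress_nvmf_fuzz
        local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
        local port="44$(printf %02d $fuzzer_type)"    # target 23 -> TCP port 4423
        mkdir -p $corpus_dir
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # retarget the template config from the default port 4420 to this run's port
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
        # leaks expected to outlive a 1-second fuzz pass are suppressed, not fixed here
        echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
        echo leak:nvmf_ctrlr_create >> $suppress_file
        $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
            -P $rootdir/../output/llvm/ -F "$trid" -c $nvmf_cfg \
            -t $timen -D $corpus_dir -Z $fuzzer_type
    }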
00:08:23.291 INFO: Seed: 3848911085 00:08:23.549 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:23.549 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:23.549 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.549 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.549 #2 INITED exec/s: 0 rss: 64Mb 00:08:23.549 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.549 This may also happen if the target rejected all inputs we tried so far 00:08:23.549 [2024-04-24 19:15:10.351827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.549 [2024-04-24 19:15:10.351860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.549 [2024-04-24 19:15:10.351898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:23.549 [2024-04-24 19:15:10.351915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.549 [2024-04-24 19:15:10.351972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:23.549 [2024-04-24 19:15:10.351987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.549 [2024-04-24 19:15:10.352046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:23.549 [2024-04-24 19:15:10.352065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.808 NEW_FUNC[1/665]: 0x4ac6c0 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:23.808 NEW_FUNC[2/665]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.808 #5 NEW cov: 11628 ft: 11630 corp: 2/24b lim: 25 exec/s: 0 rss: 70Mb L: 23/23 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:23.808 [2024-04-24 19:15:10.692831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.808 [2024-04-24 19:15:10.692901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.808 [2024-04-24 19:15:10.692998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:23.808 [2024-04-24 19:15:10.693029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.808 [2024-04-24 19:15:10.693118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:23.808 [2024-04-24 19:15:10.693146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.808 [2024-04-24 19:15:10.693228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:23.808 [2024-04-24 19:15:10.693257] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.808 NEW_FUNC[1/6]: 0xf21f40 in rte_get_tsc_cycles /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/include/rte_cycles.h:61 00:08:23.808 NEW_FUNC[2/6]: 0x16f3920 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:08:23.808 #10 NEW cov: 11803 ft: 12256 corp: 3/48b lim: 25 exec/s: 0 rss: 71Mb L: 24/24 MS: 5 ChangeBit-ShuffleBytes-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:23.808 [2024-04-24 19:15:10.742305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.808 [2024-04-24 19:15:10.742335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.808 #12 NEW cov: 11809 ft: 13175 corp: 4/54b lim: 25 exec/s: 0 rss: 71Mb L: 6/24 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:23.808 [2024-04-24 19:15:10.782438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.808 [2024-04-24 19:15:10.782466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.808 #13 NEW cov: 11894 ft: 13429 corp: 5/60b lim: 25 exec/s: 0 rss: 71Mb L: 6/24 MS: 1 ChangeByte- 00:08:24.066 [2024-04-24 19:15:10.832971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:10.832999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:10.833046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.066 [2024-04-24 19:15:10.833066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:10.833122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.066 [2024-04-24 19:15:10.833138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:10.833193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.066 [2024-04-24 19:15:10.833208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.066 #14 NEW cov: 11894 ft: 13526 corp: 6/83b lim: 25 exec/s: 0 rss: 72Mb L: 23/24 MS: 1 ShuffleBytes- 00:08:24.066 [2024-04-24 19:15:10.872701] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:10.872727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 #15 NEW cov: 11894 ft: 13572 corp: 7/89b lim: 25 exec/s: 0 rss: 72Mb L: 6/24 MS: 1 CrossOver- 00:08:24.066 [2024-04-24 19:15:10.912797] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:10.912824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 #16 NEW cov: 11894 ft: 13689 corp: 8/95b lim: 25 exec/s: 0 rss: 72Mb L: 6/24 MS: 1 CopyPart- 00:08:24.066 [2024-04-24 19:15:10.962939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:10.962967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 #17 NEW cov: 11894 ft: 13774 corp: 9/101b lim: 25 exec/s: 0 rss: 72Mb L: 6/24 MS: 1 ChangeByte- 00:08:24.066 [2024-04-24 19:15:11.003023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:11.003050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 #23 NEW cov: 11894 ft: 13820 corp: 10/107b lim: 25 exec/s: 0 rss: 72Mb L: 6/24 MS: 1 ShuffleBytes- 00:08:24.066 [2024-04-24 19:15:11.043534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.066 [2024-04-24 19:15:11.043562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:11.043598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.066 [2024-04-24 19:15:11.043614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:11.043670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.066 [2024-04-24 19:15:11.043686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.066 [2024-04-24 19:15:11.043741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.066 [2024-04-24 19:15:11.043756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.066 #24 NEW cov: 11894 ft: 13845 corp: 11/131b lim: 25 exec/s: 0 rss: 72Mb L: 24/24 MS: 1 InsertByte- 00:08:24.325 [2024-04-24 19:15:11.083753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.325 [2024-04-24 19:15:11.083782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.083836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.325 [2024-04-24 19:15:11.083853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.083914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.325 [2024-04-24 19:15:11.083932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.083989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.325 [2024-04-24 19:15:11.084006] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.084066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:24.325 [2024-04-24 19:15:11.084082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.325 #25 NEW cov: 11894 ft: 13944 corp: 12/156b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 InsertByte- 00:08:24.325 [2024-04-24 19:15:11.133505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.325 [2024-04-24 19:15:11.133535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.325 #31 NEW cov: 11894 ft: 13961 corp: 13/162b lim: 25 exec/s: 0 rss: 72Mb L: 6/25 MS: 1 CopyPart- 00:08:24.325 [2024-04-24 19:15:11.174021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.325 [2024-04-24 19:15:11.174052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.174113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.325 [2024-04-24 19:15:11.174130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.174184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.325 [2024-04-24 19:15:11.174199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.174254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.325 [2024-04-24 19:15:11.174271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.174324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:24.325 [2024-04-24 19:15:11.174339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.325 #32 NEW cov: 11894 ft: 13977 corp: 14/187b lim: 25 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ChangeBit- 00:08:24.325 [2024-04-24 19:15:11.224023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.325 [2024-04-24 19:15:11.224050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.325 [2024-04-24 19:15:11.224121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.326 [2024-04-24 19:15:11.224140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.224196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.326 [2024-04-24 19:15:11.224212] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.224267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.326 [2024-04-24 19:15:11.224283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.326 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.326 #33 NEW cov: 11917 ft: 14026 corp: 15/210b lim: 25 exec/s: 0 rss: 72Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:24.326 [2024-04-24 19:15:11.264126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.326 [2024-04-24 19:15:11.264152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.264205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.326 [2024-04-24 19:15:11.264222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.264277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.326 [2024-04-24 19:15:11.264293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.264348] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.326 [2024-04-24 19:15:11.264363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.326 #34 NEW cov: 11917 ft: 14051 corp: 16/234b lim: 25 exec/s: 0 rss: 72Mb L: 24/25 MS: 1 ChangeBit- 00:08:24.326 [2024-04-24 19:15:11.304265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.326 [2024-04-24 19:15:11.304294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.304341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.326 [2024-04-24 19:15:11.304357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.304413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.326 [2024-04-24 19:15:11.304430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.326 [2024-04-24 19:15:11.304486] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.326 [2024-04-24 19:15:11.304502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.326 #35 NEW cov: 11917 ft: 14071 corp: 17/254b lim: 25 exec/s: 0 rss: 73Mb L: 20/25 MS: 1 CrossOver- 00:08:24.585 [2024-04-24 19:15:11.344045] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.344079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 #36 NEW cov: 11917 ft: 14144 corp: 18/260b lim: 25 exec/s: 36 rss: 73Mb L: 6/25 MS: 1 ChangeBit- 00:08:24.585 [2024-04-24 19:15:11.384511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.384539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.384582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.585 [2024-04-24 19:15:11.384597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.384649] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.585 [2024-04-24 19:15:11.384665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.384721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.585 [2024-04-24 19:15:11.384738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.585 #37 NEW cov: 11917 ft: 14188 corp: 19/283b lim: 25 exec/s: 37 rss: 73Mb L: 23/25 MS: 1 ChangeByte- 00:08:24.585 [2024-04-24 19:15:11.424754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.424780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.424838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.585 [2024-04-24 19:15:11.424854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.424906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.585 [2024-04-24 19:15:11.424921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.424975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.585 [2024-04-24 19:15:11.424990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.425044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:24.585 [2024-04-24 19:15:11.425064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.585 #38 NEW cov: 11917 ft: 14227 corp: 20/308b lim: 25 exec/s: 38 rss: 73Mb L: 25/25 MS: 1 CrossOver- 00:08:24.585 [2024-04-24 19:15:11.464427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.464453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 #39 NEW cov: 11917 ft: 14322 corp: 21/315b lim: 25 exec/s: 39 rss: 73Mb L: 7/25 MS: 1 InsertByte- 00:08:24.585 [2024-04-24 19:15:11.504511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.504537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 #42 NEW cov: 11917 ft: 14337 corp: 22/323b lim: 25 exec/s: 42 rss: 73Mb L: 8/25 MS: 3 EraseBytes-CopyPart-CopyPart- 00:08:24.585 [2024-04-24 19:15:11.545097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.545125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.545178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.585 [2024-04-24 19:15:11.545193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.545246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.585 [2024-04-24 19:15:11.545265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.545319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.585 [2024-04-24 19:15:11.545335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.545389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:24.585 [2024-04-24 19:15:11.545405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.585 #48 NEW cov: 11917 ft: 14341 corp: 23/348b lim: 25 exec/s: 48 rss: 73Mb L: 25/25 MS: 1 InsertByte- 00:08:24.585 [2024-04-24 19:15:11.585077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.585 [2024-04-24 19:15:11.585103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.585159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.585 [2024-04-24 19:15:11.585175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.585230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.585 [2024-04-24 19:15:11.585244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.585 [2024-04-24 19:15:11.585301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.585 [2024-04-24 19:15:11.585316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.844 #49 NEW cov: 11917 ft: 14364 corp: 24/371b lim: 25 exec/s: 49 rss: 73Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:24.844 [2024-04-24 19:15:11.624850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.624877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 #50 NEW cov: 11917 ft: 14372 corp: 25/377b lim: 25 exec/s: 50 rss: 73Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:24.844 [2024-04-24 19:15:11.665340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.665368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.665415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.844 [2024-04-24 19:15:11.665430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.665484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.844 [2024-04-24 19:15:11.665502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.665557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.844 [2024-04-24 19:15:11.665573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.844 #51 NEW cov: 11917 ft: 14389 corp: 26/399b lim: 25 exec/s: 51 rss: 73Mb L: 22/25 MS: 1 EraseBytes- 00:08:24.844 [2024-04-24 19:15:11.705111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.705138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 #52 NEW cov: 11917 ft: 14394 corp: 27/405b lim: 25 exec/s: 52 rss: 73Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:24.844 [2024-04-24 19:15:11.745544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.745571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.745623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.844 [2024-04-24 19:15:11.745639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.745693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.844 [2024-04-24 19:15:11.745708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.844 
[2024-04-24 19:15:11.745762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.844 [2024-04-24 19:15:11.745778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.844 #53 NEW cov: 11917 ft: 14399 corp: 28/428b lim: 25 exec/s: 53 rss: 73Mb L: 23/25 MS: 1 ChangeByte- 00:08:24.844 [2024-04-24 19:15:11.785322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.785349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 #54 NEW cov: 11917 ft: 14414 corp: 29/435b lim: 25 exec/s: 54 rss: 73Mb L: 7/25 MS: 1 ChangeByte- 00:08:24.844 [2024-04-24 19:15:11.825774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.844 [2024-04-24 19:15:11.825802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.825853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.844 [2024-04-24 19:15:11.825869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.825921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.844 [2024-04-24 19:15:11.825936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.844 [2024-04-24 19:15:11.825988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.844 [2024-04-24 19:15:11.826003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.844 #55 NEW cov: 11917 ft: 14450 corp: 30/455b lim: 25 exec/s: 55 rss: 73Mb L: 20/25 MS: 1 ChangeByte- 00:08:25.102 [2024-04-24 19:15:11.865552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:11.865579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 #56 NEW cov: 11917 ft: 14453 corp: 31/461b lim: 25 exec/s: 56 rss: 74Mb L: 6/25 MS: 1 ChangeBit- 00:08:25.102 [2024-04-24 19:15:11.906022] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:11.906049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:11.906105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.102 [2024-04-24 19:15:11.906122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:11.906179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.102 [2024-04-24 19:15:11.906195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:11.906250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.102 [2024-04-24 19:15:11.906265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.102 #57 NEW cov: 11917 ft: 14463 corp: 32/484b lim: 25 exec/s: 57 rss: 74Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:25.102 [2024-04-24 19:15:11.945760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:11.945786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 #58 NEW cov: 11917 ft: 14475 corp: 33/490b lim: 25 exec/s: 58 rss: 74Mb L: 6/25 MS: 1 ShuffleBytes- 00:08:25.102 [2024-04-24 19:15:11.985884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:11.985910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 #59 NEW cov: 11917 ft: 14481 corp: 34/496b lim: 25 exec/s: 59 rss: 74Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:25.102 [2024-04-24 19:15:12.026354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:12.026381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:12.026433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.102 [2024-04-24 19:15:12.026449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:12.026505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.102 [2024-04-24 19:15:12.026521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.102 [2024-04-24 19:15:12.026576] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.102 [2024-04-24 19:15:12.026592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.102 #60 NEW cov: 11917 ft: 14502 corp: 35/519b lim: 25 exec/s: 60 rss: 74Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:25.102 [2024-04-24 19:15:12.066075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:12.066100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.102 #61 NEW cov: 11917 ft: 14506 corp: 36/526b lim: 25 exec/s: 61 rss: 74Mb L: 7/25 MS: 1 InsertByte- 00:08:25.102 [2024-04-24 19:15:12.106219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.102 [2024-04-24 19:15:12.106245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 #62 NEW cov: 
11917 ft: 14516 corp: 37/532b lim: 25 exec/s: 62 rss: 74Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:25.360 [2024-04-24 19:15:12.146309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.146336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 #63 NEW cov: 11917 ft: 14532 corp: 38/538b lim: 25 exec/s: 63 rss: 74Mb L: 6/25 MS: 1 ChangeByte- 00:08:25.360 [2024-04-24 19:15:12.186439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.186471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 #64 NEW cov: 11917 ft: 14542 corp: 39/544b lim: 25 exec/s: 64 rss: 74Mb L: 6/25 MS: 1 ChangeByte- 00:08:25.360 [2024-04-24 19:15:12.226570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.226597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 #65 NEW cov: 11917 ft: 14568 corp: 40/552b lim: 25 exec/s: 65 rss: 74Mb L: 8/25 MS: 1 CMP- DE: "\010\000"- 00:08:25.360 [2024-04-24 19:15:12.266673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.266700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 #66 NEW cov: 11917 ft: 14578 corp: 41/558b lim: 25 exec/s: 66 rss: 74Mb L: 6/25 MS: 1 ChangeByte- 00:08:25.360 [2024-04-24 19:15:12.307284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.307312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.307379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.360 [2024-04-24 19:15:12.307396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.307451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.360 [2024-04-24 19:15:12.307465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.307520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.360 [2024-04-24 19:15:12.307536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.307593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:25.360 [2024-04-24 19:15:12.307609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:25.360 #67 NEW cov: 11917 ft: 14581 corp: 42/583b lim: 25 exec/s: 67 rss: 74Mb L: 25/25 MS: 1 CopyPart- 00:08:25.360 [2024-04-24 
19:15:12.347374] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.360 [2024-04-24 19:15:12.347400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.347452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.360 [2024-04-24 19:15:12.347469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.347537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.360 [2024-04-24 19:15:12.347554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.360 [2024-04-24 19:15:12.347611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.361 [2024-04-24 19:15:12.347627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.361 [2024-04-24 19:15:12.347683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:25.361 [2024-04-24 19:15:12.347699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:25.361 #68 NEW cov: 11917 ft: 14586 corp: 43/608b lim: 25 exec/s: 34 rss: 74Mb L: 25/25 MS: 1 PersAutoDict- DE: "\010\000"- 00:08:25.361 #68 DONE cov: 11917 ft: 14586 corp: 43/608b lim: 25 exec/s: 34 rss: 74Mb 00:08:25.361 ###### Recommended dictionary. ###### 00:08:25.361 "\010\000" # Uses: 1 00:08:25.361 ###### End of recommended dictionary. 
###### 00:08:25.361 Done 68 runs in 2 second(s) 00:08:25.618 19:15:12 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.618 19:15:12 -- ../common.sh@72 -- # (( i++ )) 00:08:25.618 19:15:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.618 19:15:12 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:25.618 19:15:12 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:25.618 19:15:12 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.618 19:15:12 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.618 19:15:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:25.618 19:15:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:25.618 19:15:12 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.618 19:15:12 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.618 19:15:12 -- nvmf/run.sh@34 -- # printf %02d 24 00:08:25.618 19:15:12 -- nvmf/run.sh@34 -- # port=4424 00:08:25.618 19:15:12 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:25.618 19:15:12 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:25.618 19:15:12 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.618 19:15:12 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.618 19:15:12 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.618 19:15:12 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:25.618 [2024-04-24 19:15:12.552965] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:25.618 [2024-04-24 19:15:12.553040] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628970 ] 00:08:25.618 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.876 [2024-04-24 19:15:12.840659] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.134 [2024-04-24 19:15:12.928800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.134 [2024-04-24 19:15:12.988052] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.134 [2024-04-24 19:15:13.004267] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:26.134 INFO: Running with entropic power schedule (0xFF, 100). 
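The records that follow are standard libFuzzer output: "Loaded 1 modules (348566 inline 8-bit counters)" reports the SanitizerCoverage instrumentation compiled into the binary, the NEW_FUNC lines mark the first time an input reaches a given function, and each "#N NEW cov: ..." status line gives the execution count, cumulative covered code edges (cov), coverage features (ft), corpus units and total bytes (corp), the current input-length cap (lim), execution rate and resident memory, the new input's length versus the largest seen (L:), and the mutation sequence that produced it (MS:). To pull the coverage progression out of a saved copy of this console output, a one-liner along these lines is enough (build.log is a placeholder filename, not a file this job produces):

    # print "execs-done coverage" pairs, e.g. "#41 11978"
    grep -o '#[0-9]* NEW cov: [0-9]*' build.log | awk '{print $1, $4}'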
00:08:26.134 INFO: Seed: 2266934344 00:08:26.134 INFO: Loaded 1 modules (348566 inline 8-bit counters): 348566 [0x28b5f4c, 0x290b0e2), 00:08:26.134 INFO: Loaded 1 PC tables (348566 PCs): 348566 [0x290b0e8,0x2e5ca48), 00:08:26.134 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:26.134 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.134 #2 INITED exec/s: 0 rss: 64Mb 00:08:26.134 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.134 This may also happen if the target rejected all inputs we tried so far 00:08:26.134 [2024-04-24 19:15:13.049725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.134 [2024-04-24 19:15:13.049757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.134 [2024-04-24 19:15:13.049797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.134 [2024-04-24 19:15:13.049816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.134 [2024-04-24 19:15:13.049871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.134 [2024-04-24 19:15:13.049887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.392 NEW_FUNC[1/672]: 0x4ad7a0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:26.392 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.392 #15 NEW cov: 11745 ft: 11745 corp: 2/77b lim: 100 exec/s: 0 rss: 70Mb L: 76/76 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:26.392 [2024-04-24 19:15:13.380552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.392 [2024-04-24 19:15:13.380596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.392 [2024-04-24 19:15:13.380654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.392 [2024-04-24 19:15:13.380670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.392 [2024-04-24 19:15:13.380725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.392 [2024-04-24 19:15:13.380741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.649 #21 NEW cov: 11875 ft: 12319 corp: 3/156b lim: 100 exec/s: 0 rss: 71Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:26.649 [2024-04-24 19:15:13.430308] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.430339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.649 #22 NEW cov: 11881 ft: 13302 corp: 4/181b lim: 100 exec/s: 0 rss: 71Mb L: 25/79 MS: 1 InsertRepeatedBytes- 00:08:26.649 [2024-04-24 19:15:13.470732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.470758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.470802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.470818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.470876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.470892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.649 #28 NEW cov: 11966 ft: 13511 corp: 5/260b lim: 100 exec/s: 0 rss: 71Mb L: 79/79 MS: 1 ChangeBit- 00:08:26.649 [2024-04-24 19:15:13.520861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.520888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.520934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.520948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.521004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.521020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.649 #29 NEW cov: 11966 ft: 13633 corp: 6/336b lim: 100 exec/s: 0 rss: 72Mb L: 76/79 MS: 1 ChangeByte- 00:08:26.649 [2024-04-24 19:15:13.560992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.561018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.561062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.561078] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.649 [2024-04-24 19:15:13.561133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.649 [2024-04-24 19:15:13.561149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.649 #30 NEW cov: 11966 ft: 13679 corp: 7/412b lim: 100 exec/s: 0 rss: 72Mb L: 76/79 MS: 1 ChangeBinInt- 00:08:26.650 [2024-04-24 19:15:13.601234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.601261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.601308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.601323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.601376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.601393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.601446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.601462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.650 #31 NEW cov: 11966 ft: 14143 corp: 8/492b lim: 100 exec/s: 0 rss: 72Mb L: 80/80 MS: 1 InsertByte- 00:08:26.650 [2024-04-24 19:15:13.641376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.641402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.641447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.641463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.641535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.650 [2024-04-24 19:15:13.641552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.650 [2024-04-24 19:15:13.641608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551379 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:26.650 [2024-04-24 19:15:13.641625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.907 #32 NEW cov: 11966 ft: 14152 corp: 9/572b lim: 100 exec/s: 0 rss: 72Mb L: 80/80 MS: 1 CopyPart- 00:08:26.907 [2024-04-24 19:15:13.691362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.691390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.691429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.691446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.691503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.691521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 #33 NEW cov: 11966 ft: 14250 corp: 10/649b lim: 100 exec/s: 0 rss: 72Mb L: 77/80 MS: 1 CrossOver- 00:08:26.907 [2024-04-24 19:15:13.731464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.731493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.731528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.731546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.731602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.731617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 #34 NEW cov: 11966 ft: 14338 corp: 11/726b lim: 100 exec/s: 0 rss: 72Mb L: 77/80 MS: 1 CopyPart- 00:08:26.907 [2024-04-24 19:15:13.771541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.771569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.771606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.771623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 
19:15:13.771681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.771697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 #35 NEW cov: 11966 ft: 14347 corp: 12/792b lim: 100 exec/s: 0 rss: 72Mb L: 66/80 MS: 1 EraseBytes- 00:08:26.907 [2024-04-24 19:15:13.811664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.811692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.811730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.811745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.811801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.811817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 #36 NEW cov: 11966 ft: 14375 corp: 13/863b lim: 100 exec/s: 0 rss: 72Mb L: 71/80 MS: 1 InsertRepeatedBytes- 00:08:26.907 [2024-04-24 19:15:13.851826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.851853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.851887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.851904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.851960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.851975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 #37 NEW cov: 11966 ft: 14408 corp: 14/939b lim: 100 exec/s: 0 rss: 72Mb L: 76/80 MS: 1 ShuffleBytes- 00:08:26.907 [2024-04-24 19:15:13.892071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.892100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.892174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 
19:15:13.892191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.892248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.892265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.907 [2024-04-24 19:15:13.892322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.907 [2024-04-24 19:15:13.892337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.907 #38 NEW cov: 11966 ft: 14430 corp: 15/1020b lim: 100 exec/s: 0 rss: 72Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:27.165 [2024-04-24 19:15:13.942096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.942126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:13.942163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3472328296227680304 len:12337 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.942180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:13.942234] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3472328296227680304 len:12338 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.942251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.165 NEW_FUNC[1/1]: 0x19c2a80 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:27.165 #39 NEW cov: 11989 ft: 14473 corp: 16/1091b lim: 100 exec/s: 0 rss: 73Mb L: 71/81 MS: 1 ChangeASCIIInt- 00:08:27.165 [2024-04-24 19:15:13.992392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.992419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:13.992467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551610 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.992483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:13.992541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.992557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 
19:15:13.992613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:13.992629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.165 #40 NEW cov: 11989 ft: 14568 corp: 17/1172b lim: 100 exec/s: 0 rss: 73Mb L: 81/81 MS: 1 ChangeBinInt- 00:08:27.165 [2024-04-24 19:15:14.042515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:14.042542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:14.042585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18086456103519911935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.165 [2024-04-24 19:15:14.042599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.165 [2024-04-24 19:15:14.042654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.042670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.042726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.042742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.166 #41 NEW cov: 11989 ft: 14603 corp: 18/1254b lim: 100 exec/s: 41 rss: 73Mb L: 82/82 MS: 1 InsertByte- 00:08:27.166 [2024-04-24 19:15:14.092649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.092675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.092733] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.092749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.092804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.092821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.092876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551379 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.092892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.166 #42 NEW cov: 11989 ft: 14642 corp: 19/1334b lim: 100 exec/s: 42 rss: 73Mb L: 80/82 MS: 1 ChangeBinInt- 00:08:27.166 [2024-04-24 19:15:14.142750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.142776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.142824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.142840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.142912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:48128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.142929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.166 [2024-04-24 19:15:14.142987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.166 [2024-04-24 19:15:14.143003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.166 #43 NEW cov: 11989 ft: 14690 corp: 20/1414b lim: 100 exec/s: 43 rss: 73Mb L: 80/82 MS: 1 InsertByte- 00:08:27.424 [2024-04-24 19:15:14.192758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.192784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.192825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.192840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.192895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.192911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.424 #44 NEW cov: 11989 ft: 14706 corp: 21/1485b lim: 100 exec/s: 44 rss: 73Mb L: 71/82 MS: 1 ChangeBit- 00:08:27.424 [2024-04-24 19:15:14.232997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.233023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.233076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.233092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.233146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744069414584400 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.233163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.233217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551379 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.233234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.424 #45 NEW cov: 11989 ft: 14716 corp: 22/1565b lim: 100 exec/s: 45 rss: 73Mb L: 80/82 MS: 1 ChangeBinInt- 00:08:27.424 [2024-04-24 19:15:14.282883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.282909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.282956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069414604031 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.282972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 #46 NEW cov: 11989 ft: 15003 corp: 23/1606b lim: 100 exec/s: 46 rss: 73Mb L: 41/82 MS: 1 EraseBytes- 00:08:27.424 [2024-04-24 19:15:14.323292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7595718150524501865 len:26986 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.323318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.323367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744071183100415 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.323384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.323438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.323454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.323507] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.323522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.424 #47 NEW cov: 11989 ft: 15087 corp: 24/1702b lim: 100 exec/s: 47 rss: 73Mb L: 96/96 MS: 1 
InsertRepeatedBytes- 00:08:27.424 [2024-04-24 19:15:14.363124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.363154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.363212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.363229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 #48 NEW cov: 11989 ft: 15119 corp: 25/1746b lim: 100 exec/s: 48 rss: 73Mb L: 44/96 MS: 1 EraseBytes- 00:08:27.424 [2024-04-24 19:15:14.403368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.403394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.403440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.403456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.424 [2024-04-24 19:15:14.403527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.424 [2024-04-24 19:15:14.403542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.424 #49 NEW cov: 11989 ft: 15130 corp: 26/1823b lim: 100 exec/s: 49 rss: 73Mb L: 77/96 MS: 1 InsertByte- 00:08:27.682 [2024-04-24 19:15:14.453667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.453694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.453744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551610 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.453761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.453815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.453832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.453885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.453901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.682 #50 NEW cov: 11989 ft: 15143 corp: 27/1904b lim: 100 exec/s: 50 rss: 73Mb L: 81/96 MS: 1 ShuffleBytes- 00:08:27.682 [2024-04-24 19:15:14.493593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.493618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.493668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:14023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.493684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.493741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.493757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.682 #51 NEW cov: 11989 ft: 15145 corp: 28/1975b lim: 100 exec/s: 51 rss: 73Mb L: 71/96 MS: 1 ChangeBinInt- 00:08:27.682 [2024-04-24 19:15:14.533723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.533749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.533796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.533811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.533867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.533881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.682 #52 NEW cov: 11989 ft: 15150 corp: 29/2046b lim: 100 exec/s: 52 rss: 73Mb L: 71/96 MS: 1 CMP- DE: "\001\004"- 00:08:27.682 [2024-04-24 19:15:14.573675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.573701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.573753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.573770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 #56 NEW cov: 11989 ft: 15211 corp: 30/2087b lim: 100 exec/s: 56 rss: 74Mb L: 41/96 MS: 4 
CopyPart-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:27.682 [2024-04-24 19:15:14.614097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.614125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.614178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.614195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.614248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:48128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.614264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.614319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.614334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.682 #57 NEW cov: 11989 ft: 15220 corp: 31/2167b lim: 100 exec/s: 57 rss: 74Mb L: 80/96 MS: 1 ChangeBit- 00:08:27.682 [2024-04-24 19:15:14.664252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:7595718150524501865 len:26986 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.664280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.664330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744071183100415 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.664346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.664415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.664432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.682 [2024-04-24 19:15:14.664486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.682 [2024-04-24 19:15:14.664501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.682 #58 NEW cov: 11989 ft: 15236 corp: 32/2266b lim: 100 exec/s: 58 rss: 74Mb L: 99/99 MS: 1 CopyPart- 00:08:27.940 [2024-04-24 19:15:14.714243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.714269] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.714315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.714331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.714385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.714400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.940 #59 NEW cov: 11989 ft: 15240 corp: 33/2337b lim: 100 exec/s: 59 rss: 74Mb L: 71/99 MS: 1 ShuffleBytes- 00:08:27.940 [2024-04-24 19:15:14.754465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.754492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.754542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.754558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.754610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744070324107007 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.754626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.754677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.754692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.940 #60 NEW cov: 11989 ft: 15362 corp: 34/2420b lim: 100 exec/s: 60 rss: 74Mb L: 83/99 MS: 1 InsertRepeatedBytes- 00:08:27.940 [2024-04-24 19:15:14.794607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.794636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.940 [2024-04-24 19:15:14.794679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.940 [2024-04-24 19:15:14.794696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.794749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:27.941 [2024-04-24 19:15:14.794765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.794820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.794835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.941 #61 NEW cov: 11989 ft: 15370 corp: 35/2500b lim: 100 exec/s: 61 rss: 74Mb L: 80/99 MS: 1 CopyPart- 00:08:27.941 [2024-04-24 19:15:14.834684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.834711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.834763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551610 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.834780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.834834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:17870283321406127872 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.834849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.834904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.834920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.941 #62 NEW cov: 11989 ft: 15443 corp: 36/2581b lim: 100 exec/s: 62 rss: 74Mb L: 81/99 MS: 1 ChangeBinInt- 00:08:27.941 [2024-04-24 19:15:14.874705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3906369332517942838 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.874731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.874796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3906369333256140342 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.874813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.874879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3906369333256126724 len:13879 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.874894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.941 #63 NEW cov: 11989 ft: 15479 corp: 37/2652b lim: 100 exec/s: 63 rss: 74Mb L: 71/99 MS: 1 PersAutoDict- DE: "\001\004"- 
00:08:27.941 [2024-04-24 19:15:14.914791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.914816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.914865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.914882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.914954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.914971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.941 #64 NEW cov: 11989 ft: 15486 corp: 38/2728b lim: 100 exec/s: 64 rss: 74Mb L: 76/99 MS: 1 CrossOver- 00:08:27.941 [2024-04-24 19:15:14.954943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.954970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.955017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.955033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 [2024-04-24 19:15:14.955089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.941 [2024-04-24 19:15:14.955106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.199 #65 NEW cov: 11989 ft: 15519 corp: 39/2804b lim: 100 exec/s: 65 rss: 74Mb L: 76/99 MS: 1 ShuffleBytes- 00:08:28.199 [2024-04-24 19:15:14.995189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:15914876658197200860 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.199 [2024-04-24 19:15:14.995215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:14.995262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:14.995279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:14.995331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:14.995348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:14.995401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:3170534137668829183 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:14.995417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.200 #66 NEW cov: 11989 ft: 15544 corp: 40/2889b lim: 100 exec/s: 66 rss: 74Mb L: 85/99 MS: 1 CopyPart- 00:08:28.200 [2024-04-24 19:15:15.045303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:15.045331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:15.045371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:15.045388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:15.045442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:15.045458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.200 [2024-04-24 19:15:15.045510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.200 [2024-04-24 19:15:15.045526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.200 #67 NEW cov: 11989 ft: 15560 corp: 41/2975b lim: 100 exec/s: 33 rss: 74Mb L: 86/99 MS: 1 InsertRepeatedBytes- 00:08:28.200 #67 DONE cov: 11989 ft: 15560 corp: 41/2975b lim: 100 exec/s: 33 rss: 74Mb 00:08:28.200 ###### Recommended dictionary. ###### 00:08:28.200 "\001\004" # Uses: 1 00:08:28.200 ###### End of recommended dictionary. 
###### 00:08:28.200 Done 67 runs in 2 second(s) 00:08:28.200 19:15:15 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.200 19:15:15 -- ../common.sh@72 -- # (( i++ )) 00:08:28.200 19:15:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.200 19:15:15 -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:28.200 00:08:28.200 real 1m6.357s 00:08:28.200 user 1m40.567s 00:08:28.200 sys 0m8.855s 00:08:28.457 19:15:15 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:28.457 19:15:15 -- common/autotest_common.sh@10 -- # set +x 00:08:28.458 ************************************ 00:08:28.458 END TEST nvmf_fuzz 00:08:28.458 ************************************ 00:08:28.458 19:15:15 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:28.458 19:15:15 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:28.458 19:15:15 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:28.458 19:15:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:28.458 19:15:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:28.458 19:15:15 -- common/autotest_common.sh@10 -- # set +x 00:08:28.458 ************************************ 00:08:28.458 START TEST vfio_fuzz 00:08:28.458 ************************************ 00:08:28.458 19:15:15 -- common/autotest_common.sh@1111 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:28.458 * Looking for test storage... 00:08:28.458 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.458 19:15:15 -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:28.458 19:15:15 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:28.458 19:15:15 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:28.458 19:15:15 -- common/autotest_common.sh@34 -- # set -e 00:08:28.458 19:15:15 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:28.458 19:15:15 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:28.458 19:15:15 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:28.458 19:15:15 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:28.458 19:15:15 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:28.458 19:15:15 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:28.458 19:15:15 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:28.719 19:15:15 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:28.719 19:15:15 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:28.719 19:15:15 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:28.719 19:15:15 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:28.719 19:15:15 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:28.719 19:15:15 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:28.719 19:15:15 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:28.719 19:15:15 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:28.719 19:15:15 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:28.719 19:15:15 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:28.719 19:15:15 -- common/build_config.sh@13 -- # 
CONFIG_VTUNE=n 00:08:28.719 19:15:15 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:28.719 19:15:15 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:28.719 19:15:15 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:28.719 19:15:15 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:28.719 19:15:15 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:28.719 19:15:15 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:28.719 19:15:15 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:28.719 19:15:15 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:28.719 19:15:15 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:28.719 19:15:15 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:28.719 19:15:15 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:28.719 19:15:15 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:28.719 19:15:15 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:28.719 19:15:15 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:28.719 19:15:15 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:28.719 19:15:15 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:28.719 19:15:15 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:28.719 19:15:15 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:28.719 19:15:15 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:28.719 19:15:15 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:28.719 19:15:15 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:28.719 19:15:15 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:28.719 19:15:15 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:28.719 19:15:15 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:28.719 19:15:15 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:28.719 19:15:15 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:28.719 19:15:15 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:28.719 19:15:15 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:28.719 19:15:15 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:28.719 19:15:15 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:28.719 19:15:15 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:28.719 19:15:15 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:28.719 19:15:15 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:28.719 19:15:15 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:28.719 19:15:15 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:28.719 19:15:15 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:08:28.719 19:15:15 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:08:28.719 19:15:15 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:08:28.719 19:15:15 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:08:28.719 19:15:15 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:08:28.719 19:15:15 -- 
common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:08:28.719 19:15:15 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:08:28.719 19:15:15 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:08:28.719 19:15:15 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:08:28.719 19:15:15 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:08:28.719 19:15:15 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:08:28.719 19:15:15 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:08:28.719 19:15:15 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:08:28.719 19:15:15 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:28.719 19:15:15 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:08:28.719 19:15:15 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:08:28.719 19:15:15 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:08:28.719 19:15:15 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:08:28.719 19:15:15 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:08:28.719 19:15:15 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:08:28.719 19:15:15 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:08:28.719 19:15:15 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:08:28.719 19:15:15 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:08:28.719 19:15:15 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 00:08:28.719 19:15:15 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:08:28.719 19:15:15 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:28.719 19:15:15 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:08:28.719 19:15:15 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:08:28.719 19:15:15 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:28.719 19:15:15 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:28.719 19:15:15 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:28.719 19:15:15 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:28.719 19:15:15 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:28.719 19:15:15 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:28.719 19:15:15 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:28.719 19:15:15 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.719 19:15:15 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:28.719 19:15:15 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:28.719 19:15:15 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:28.719 19:15:15 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:28.719 19:15:15 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:28.719 19:15:15 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:28.719 19:15:15 -- common/applications.sh@22 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:28.719 19:15:15 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:28.719 #define SPDK_CONFIG_H 00:08:28.719 #define SPDK_CONFIG_APPS 1 00:08:28.719 #define SPDK_CONFIG_ARCH native 00:08:28.719 #undef SPDK_CONFIG_ASAN 00:08:28.719 #undef SPDK_CONFIG_AVAHI 00:08:28.719 #undef SPDK_CONFIG_CET 00:08:28.719 #define SPDK_CONFIG_COVERAGE 1 00:08:28.719 #define SPDK_CONFIG_CROSS_PREFIX 00:08:28.719 #undef SPDK_CONFIG_CRYPTO 00:08:28.719 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:28.719 #undef SPDK_CONFIG_CUSTOMOCF 00:08:28.719 #undef SPDK_CONFIG_DAOS 00:08:28.719 #define SPDK_CONFIG_DAOS_DIR 00:08:28.719 #define SPDK_CONFIG_DEBUG 1 00:08:28.719 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:28.719 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:28.719 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:28.719 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:28.719 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:28.719 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:28.719 #define SPDK_CONFIG_EXAMPLES 1 00:08:28.719 #undef SPDK_CONFIG_FC 00:08:28.719 #define SPDK_CONFIG_FC_PATH 00:08:28.719 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:28.719 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:28.719 #undef SPDK_CONFIG_FUSE 00:08:28.719 #define SPDK_CONFIG_FUZZER 1 00:08:28.719 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:28.719 #undef SPDK_CONFIG_GOLANG 00:08:28.719 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:28.719 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:28.719 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:28.719 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:08:28.719 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:28.719 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:28.719 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:28.719 #define SPDK_CONFIG_IDXD 1 00:08:28.719 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:28.719 #undef SPDK_CONFIG_IPSEC_MB 00:08:28.719 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:28.719 #define SPDK_CONFIG_ISAL 1 00:08:28.719 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:28.719 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:28.719 #define SPDK_CONFIG_LIBDIR 00:08:28.719 #undef SPDK_CONFIG_LTO 00:08:28.719 #define SPDK_CONFIG_MAX_LCORES 00:08:28.719 #define SPDK_CONFIG_NVME_CUSE 1 00:08:28.719 #undef SPDK_CONFIG_OCF 00:08:28.719 #define SPDK_CONFIG_OCF_PATH 00:08:28.719 #define SPDK_CONFIG_OPENSSL_PATH 00:08:28.719 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:28.719 #define SPDK_CONFIG_PGO_DIR 00:08:28.719 #undef SPDK_CONFIG_PGO_USE 00:08:28.719 #define SPDK_CONFIG_PREFIX /usr/local 00:08:28.719 #undef SPDK_CONFIG_RAID5F 00:08:28.719 #undef SPDK_CONFIG_RBD 00:08:28.719 #define SPDK_CONFIG_RDMA 1 00:08:28.719 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:28.719 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:28.719 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:28.719 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:28.719 #undef SPDK_CONFIG_SHARED 00:08:28.719 #undef SPDK_CONFIG_SMA 00:08:28.719 #define SPDK_CONFIG_TESTS 1 00:08:28.719 #undef SPDK_CONFIG_TSAN 00:08:28.719 #define SPDK_CONFIG_UBLK 1 00:08:28.719 #define SPDK_CONFIG_UBSAN 1 00:08:28.719 #undef SPDK_CONFIG_UNIT_TESTS 00:08:28.719 #undef SPDK_CONFIG_URING 00:08:28.719 #define SPDK_CONFIG_URING_PATH 00:08:28.719 #undef SPDK_CONFIG_URING_ZNS 00:08:28.719 #undef SPDK_CONFIG_USDT 00:08:28.720 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:28.720 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 
00:08:28.720 #define SPDK_CONFIG_VFIO_USER 1 00:08:28.720 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:28.720 #define SPDK_CONFIG_VHOST 1 00:08:28.720 #define SPDK_CONFIG_VIRTIO 1 00:08:28.720 #undef SPDK_CONFIG_VTUNE 00:08:28.720 #define SPDK_CONFIG_VTUNE_DIR 00:08:28.720 #define SPDK_CONFIG_WERROR 1 00:08:28.720 #define SPDK_CONFIG_WPDK_DIR 00:08:28.720 #undef SPDK_CONFIG_XNVME 00:08:28.720 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:28.720 19:15:15 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:28.720 19:15:15 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:28.720 19:15:15 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:28.720 19:15:15 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:28.720 19:15:15 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:28.720 19:15:15 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.720 19:15:15 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.720 19:15:15 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.720 19:15:15 -- paths/export.sh@5 -- # export PATH 00:08:28.720 19:15:15 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.720 19:15:15 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:28.720 19:15:15 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:28.720 19:15:15 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:28.720 19:15:15 -- pm/common@6 -- # 
_pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:28.720 19:15:15 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:28.720 19:15:15 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:28.720 19:15:15 -- pm/common@67 -- # TEST_TAG=N/A 00:08:28.720 19:15:15 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:28.720 19:15:15 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:28.720 19:15:15 -- pm/common@71 -- # uname -s 00:08:28.720 19:15:15 -- pm/common@71 -- # PM_OS=Linux 00:08:28.720 19:15:15 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:28.720 19:15:15 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:08:28.720 19:15:15 -- pm/common@76 -- # [[ Linux == Linux ]] 00:08:28.720 19:15:15 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:08:28.720 19:15:15 -- pm/common@76 -- # [[ ! -e /.dockerenv ]] 00:08:28.720 19:15:15 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:28.720 19:15:15 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:28.720 19:15:15 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:08:28.720 19:15:15 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:08:28.720 19:15:15 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:28.720 19:15:15 -- common/autotest_common.sh@57 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:08:28.720 19:15:15 -- common/autotest_common.sh@61 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:28.720 19:15:15 -- common/autotest_common.sh@63 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:08:28.720 19:15:15 -- common/autotest_common.sh@65 -- # : 1 00:08:28.720 19:15:15 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:28.720 19:15:15 -- common/autotest_common.sh@67 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:08:28.720 19:15:15 -- common/autotest_common.sh@69 -- # : 00:08:28.720 19:15:15 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:08:28.720 19:15:15 -- common/autotest_common.sh@71 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:08:28.720 19:15:15 -- common/autotest_common.sh@73 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:08:28.720 19:15:15 -- common/autotest_common.sh@75 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:08:28.720 19:15:15 -- common/autotest_common.sh@77 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:28.720 19:15:15 -- common/autotest_common.sh@79 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:08:28.720 19:15:15 -- common/autotest_common.sh@81 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:08:28.720 19:15:15 -- common/autotest_common.sh@83 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:08:28.720 19:15:15 -- common/autotest_common.sh@85 -- # : 0 00:08:28.720 19:15:15 -- 
common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:08:28.720 19:15:15 -- common/autotest_common.sh@87 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:08:28.720 19:15:15 -- common/autotest_common.sh@89 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:08:28.720 19:15:15 -- common/autotest_common.sh@91 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:08:28.720 19:15:15 -- common/autotest_common.sh@93 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:08:28.720 19:15:15 -- common/autotest_common.sh@95 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:28.720 19:15:15 -- common/autotest_common.sh@97 -- # : 1 00:08:28.720 19:15:15 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:08:28.720 19:15:15 -- common/autotest_common.sh@99 -- # : 1 00:08:28.720 19:15:15 -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:08:28.720 19:15:15 -- common/autotest_common.sh@101 -- # : rdma 00:08:28.720 19:15:15 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:28.720 19:15:15 -- common/autotest_common.sh@103 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:08:28.720 19:15:15 -- common/autotest_common.sh@105 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:08:28.720 19:15:15 -- common/autotest_common.sh@107 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:08:28.720 19:15:15 -- common/autotest_common.sh@109 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:08:28.720 19:15:15 -- common/autotest_common.sh@111 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:08:28.720 19:15:15 -- common/autotest_common.sh@113 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:08:28.720 19:15:15 -- common/autotest_common.sh@115 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:08:28.720 19:15:15 -- common/autotest_common.sh@117 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:28.720 19:15:15 -- common/autotest_common.sh@119 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:08:28.720 19:15:15 -- common/autotest_common.sh@121 -- # : 1 00:08:28.720 19:15:15 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:08:28.720 19:15:15 -- common/autotest_common.sh@123 -- # : 00:08:28.720 19:15:15 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:28.720 19:15:15 -- common/autotest_common.sh@125 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:08:28.720 19:15:15 -- common/autotest_common.sh@127 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:08:28.720 19:15:15 -- common/autotest_common.sh@129 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:08:28.720 19:15:15 -- common/autotest_common.sh@131 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:08:28.720 19:15:15 -- common/autotest_common.sh@133 -- # : 0 
00:08:28.720 19:15:15 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:08:28.720 19:15:15 -- common/autotest_common.sh@135 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:08:28.720 19:15:15 -- common/autotest_common.sh@137 -- # : 00:08:28.720 19:15:15 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:08:28.720 19:15:15 -- common/autotest_common.sh@139 -- # : true 00:08:28.720 19:15:15 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:08:28.720 19:15:15 -- common/autotest_common.sh@141 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:08:28.720 19:15:15 -- common/autotest_common.sh@143 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:08:28.720 19:15:15 -- common/autotest_common.sh@145 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:08:28.720 19:15:15 -- common/autotest_common.sh@147 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:08:28.720 19:15:15 -- common/autotest_common.sh@149 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:08:28.720 19:15:15 -- common/autotest_common.sh@151 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:08:28.720 19:15:15 -- common/autotest_common.sh@153 -- # : 00:08:28.720 19:15:15 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:08:28.720 19:15:15 -- common/autotest_common.sh@155 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:08:28.720 19:15:15 -- common/autotest_common.sh@157 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:08:28.720 19:15:15 -- common/autotest_common.sh@159 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:08:28.720 19:15:15 -- common/autotest_common.sh@161 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:08:28.720 19:15:15 -- common/autotest_common.sh@163 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:08:28.720 19:15:15 -- common/autotest_common.sh@166 -- # : 00:08:28.720 19:15:15 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:08:28.720 19:15:15 -- common/autotest_common.sh@168 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:08:28.720 19:15:15 -- common/autotest_common.sh@170 -- # : 0 00:08:28.720 19:15:15 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:28.720 19:15:15 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@176 
-- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.720 19:15:15 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:28.720 19:15:15 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:28.720 19:15:15 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:28.720 19:15:15 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:28.720 19:15:15 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:28.720 19:15:15 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:08:28.720 19:15:15 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:28.720 
19:15:15 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:28.720 19:15:15 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:28.720 19:15:15 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:28.720 19:15:15 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:28.720 19:15:15 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:08:28.720 19:15:15 -- common/autotest_common.sh@199 -- # cat 00:08:28.720 19:15:15 -- common/autotest_common.sh@225 -- # echo leak:libfuse3.so 00:08:28.720 19:15:15 -- common/autotest_common.sh@227 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:28.720 19:15:15 -- common/autotest_common.sh@227 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:28.720 19:15:15 -- common/autotest_common.sh@229 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:28.720 19:15:15 -- common/autotest_common.sh@229 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:28.720 19:15:15 -- common/autotest_common.sh@231 -- # '[' -z /var/spdk/dependencies ']' 00:08:28.720 19:15:15 -- common/autotest_common.sh@234 -- # export DEPENDENCY_DIR 00:08:28.720 19:15:15 -- common/autotest_common.sh@238 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:28.720 19:15:15 -- common/autotest_common.sh@238 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:28.720 19:15:15 -- common/autotest_common.sh@239 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.720 19:15:15 -- common/autotest_common.sh@239 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.720 19:15:15 -- common/autotest_common.sh@242 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:28.720 19:15:15 -- common/autotest_common.sh@242 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:28.720 19:15:15 -- common/autotest_common.sh@243 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:28.720 19:15:15 -- common/autotest_common.sh@243 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:28.720 19:15:15 -- common/autotest_common.sh@245 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:28.720 19:15:15 -- common/autotest_common.sh@245 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:28.720 19:15:15 -- common/autotest_common.sh@248 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:28.720 19:15:15 -- common/autotest_common.sh@248 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:28.720 19:15:15 -- common/autotest_common.sh@251 -- # '[' 0 -eq 0 ']' 00:08:28.720 19:15:15 -- common/autotest_common.sh@252 -- # export valgrind= 00:08:28.720 19:15:15 -- common/autotest_common.sh@252 -- # valgrind= 00:08:28.720 19:15:15 -- common/autotest_common.sh@258 -- # uname -s 00:08:28.720 19:15:15 -- common/autotest_common.sh@258 -- # '[' Linux = Linux ']' 00:08:28.720 19:15:15 -- common/autotest_common.sh@259 -- # HUGEMEM=4096 00:08:28.720 19:15:15 -- common/autotest_common.sh@260 -- # export CLEAR_HUGE=yes 00:08:28.720 
19:15:15 -- common/autotest_common.sh@260 -- # CLEAR_HUGE=yes 00:08:28.720 19:15:15 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@261 -- # [[ 0 -eq 1 ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@268 -- # MAKE=make 00:08:28.720 19:15:15 -- common/autotest_common.sh@269 -- # MAKEFLAGS=-j72 00:08:28.720 19:15:15 -- common/autotest_common.sh@285 -- # export HUGEMEM=4096 00:08:28.720 19:15:15 -- common/autotest_common.sh@285 -- # HUGEMEM=4096 00:08:28.720 19:15:15 -- common/autotest_common.sh@287 -- # NO_HUGE=() 00:08:28.720 19:15:15 -- common/autotest_common.sh@288 -- # TEST_MODE= 00:08:28.720 19:15:15 -- common/autotest_common.sh@307 -- # [[ -z 1629465 ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@307 -- # kill -0 1629465 00:08:28.720 19:15:15 -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:28.720 19:15:15 -- common/autotest_common.sh@317 -- # [[ -v testdir ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@319 -- # local requested_size=2147483648 00:08:28.720 19:15:15 -- common/autotest_common.sh@320 -- # local mount target_dir 00:08:28.720 19:15:15 -- common/autotest_common.sh@322 -- # local -A mounts fss sizes avails uses 00:08:28.720 19:15:15 -- common/autotest_common.sh@323 -- # local source fs size avail mount use 00:08:28.720 19:15:15 -- common/autotest_common.sh@325 -- # local storage_fallback storage_candidates 00:08:28.720 19:15:15 -- common/autotest_common.sh@327 -- # mktemp -udt spdk.XXXXXX 00:08:28.720 19:15:15 -- common/autotest_common.sh@327 -- # storage_fallback=/tmp/spdk.jNpvpH 00:08:28.720 19:15:15 -- common/autotest_common.sh@332 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:28.720 19:15:15 -- common/autotest_common.sh@334 -- # [[ -n '' ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@339 -- # [[ -n '' ]] 00:08:28.720 19:15:15 -- common/autotest_common.sh@344 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.jNpvpH/tests/vfio /tmp/spdk.jNpvpH 00:08:28.720 19:15:15 -- common/autotest_common.sh@347 -- # requested_size=2214592512 00:08:28.720 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.720 19:15:15 -- common/autotest_common.sh@316 -- # df -T 00:08:28.720 19:15:15 -- common/autotest_common.sh@316 -- # grep -v Filesystem 00:08:28.720 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=spdk_devtmpfs 00:08:28.720 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=devtmpfs 00:08:28.720 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=67108864 00:08:28.720 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=67108864 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=0 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=/dev/pmem0 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=ext2 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=818380800 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=5284429824 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=4466049024 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # 
mounts["$mount"]=spdk_root 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=overlay 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=86750863360 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=94508572672 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=7757709312 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=47249575936 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254286336 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=4710400 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=18895835136 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=18901716992 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=5881856 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=47253733376 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=47254286336 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=552960 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # mounts["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@350 -- # fss["$mount"]=tmpfs 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # avails["$mount"]=9450852352 00:08:28.721 19:15:15 -- common/autotest_common.sh@351 -- # sizes["$mount"]=9450856448 00:08:28.721 19:15:15 -- common/autotest_common.sh@352 -- # uses["$mount"]=4096 00:08:28.721 19:15:15 -- common/autotest_common.sh@349 -- # read -r source fs size use avail _ mount 00:08:28.721 19:15:15 -- common/autotest_common.sh@355 -- # printf '* Looking for test storage...\n' 00:08:28.721 * Looking for test storage... 
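
The trace just above and immediately below is autotest_common.sh's set_test_storage probe: it tables up `df -T` and then walks the candidate directories until one offers the requested free space. A condensed bash sketch of that logic follows; it is a simplified reading of the trace, not the verbatim autotest_common.sh, and the byte-unit df flag (-B1) is an assumption to match the byte-sized values seen in the trace. storage_candidates is the array set earlier in the trace.

    # Condensed sketch of the traced storage probe; simplified, not the
    # verbatim autotest_common.sh.
    requested_size=2214592512   # 2 GiB of test data plus 64 MiB of slack
    declare -A fss avails sizes
    while read -r source fs size use avail _ mount; do
        fss["$mount"]=$fs        # filesystem type (tmpfs, overlay, ...)
        avails["$mount"]=$avail  # free bytes on this mount
        sizes["$mount"]=$size    # total bytes on this mount
    done < <(df -T -B1 | grep -v Filesystem)
    for target_dir in "${storage_candidates[@]}"; do
        # Map the candidate directory to its mount point, as traced:
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        target_space=${avails[$mount]}
        if (( target_space >= requested_size )); then
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            break
        fi
    done
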
00:08:28.721 19:15:15 -- common/autotest_common.sh@357 -- # local target_space new_size
00:08:28.721 19:15:15 -- common/autotest_common.sh@358 -- # for target_dir in "${storage_candidates[@]}"
00:08:28.721 19:15:15 -- common/autotest_common.sh@361 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:28.721 19:15:15 -- common/autotest_common.sh@361 -- # awk '$1 !~ /Filesystem/{print $6}'
00:08:28.721 19:15:15 -- common/autotest_common.sh@361 -- # mount=/
00:08:28.721 19:15:15 -- common/autotest_common.sh@363 -- # target_space=86750863360
00:08:28.721 19:15:15 -- common/autotest_common.sh@364 -- # (( target_space == 0 || target_space < requested_size ))
00:08:28.721 19:15:15 -- common/autotest_common.sh@367 -- # (( target_space >= requested_size ))
00:08:28.721 19:15:15 -- common/autotest_common.sh@369 -- # [[ overlay == tmpfs ]]
00:08:28.721 19:15:15 -- common/autotest_common.sh@369 -- # [[ overlay == ramfs ]]
00:08:28.721 19:15:15 -- common/autotest_common.sh@369 -- # [[ / == / ]]
00:08:28.721 19:15:15 -- common/autotest_common.sh@370 -- # new_size=9972301824
00:08:28.721 19:15:15 -- common/autotest_common.sh@371 -- # (( new_size * 100 / sizes[/] > 95 ))
00:08:28.721 19:15:15 -- common/autotest_common.sh@376 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:28.721 19:15:15 -- common/autotest_common.sh@376 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:28.721 19:15:15 -- common/autotest_common.sh@377 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:28.721 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio
00:08:28.721 19:15:15 -- common/autotest_common.sh@378 -- # return 0
00:08:28.721 19:15:15 -- common/autotest_common.sh@1668 -- # set -o errtrace
00:08:28.721 19:15:15 -- common/autotest_common.sh@1669 -- # shopt -s extdebug
00:08:28.721 19:15:15 -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:08:28.721 19:15:15 -- common/autotest_common.sh@1672 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:08:28.721 19:15:15 -- common/autotest_common.sh@1673 -- # true
00:08:28.721 19:15:15 -- common/autotest_common.sh@1675 -- # xtrace_fd
00:08:28.721 19:15:15 -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:08:28.721 19:15:15 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:08:28.721 19:15:15 -- common/autotest_common.sh@27 -- # exec
00:08:28.721 19:15:15 -- common/autotest_common.sh@29 -- # exec
00:08:28.721 19:15:15 -- common/autotest_common.sh@31 -- # xtrace_restore
00:08:28.721 19:15:15 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:08:28.721 19:15:15 -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:08:28.721 19:15:15 -- common/autotest_common.sh@18 -- # set -x
00:08:28.721 19:15:15 -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh
00:08:28.721 19:15:15 -- ../common.sh@8 -- # pids=()
00:08:28.721 19:15:15 -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
00:08:28.721 19:15:15 -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c
00:08:28.721 19:15:15 -- vfio/run.sh@68 -- # fuzz_num=7
00:08:28.721 19:15:15 -- vfio/run.sh@69 -- # (( fuzz_num != 0 ))
00:08:28.721 19:15:15 -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT
00:08:28.721 19:15:15 -- vfio/run.sh@74 -- # mem_size=0
00:08:28.721 19:15:15 -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]]
00:08:28.721 19:15:15 -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1
00:08:28.721 19:15:15 -- ../common.sh@69 -- # local fuzz_num=7
00:08:28.721 19:15:15 -- ../common.sh@70 -- # local time=1
00:08:28.721 19:15:15 -- ../common.sh@72 -- # (( i = 0 ))
00:08:28.721 19:15:15 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:28.721 19:15:15 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1
00:08:28.721 19:15:15 -- vfio/run.sh@22 -- # local fuzzer_type=0
00:08:28.721 19:15:15 -- vfio/run.sh@23 -- # local timen=1
00:08:28.721 19:15:15 -- vfio/run.sh@24 -- # local core=0x1
00:08:28.721 19:15:15 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:28.721 19:15:15 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0
00:08:28.721 19:15:15 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1
00:08:28.721 19:15:15 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2
00:08:28.721 19:15:15 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf
00:08:28.721 19:15:15 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:08:28.721 19:15:15 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:08:28.721 19:15:15 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:28.721 19:15:15 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%;
00:08:28.721 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:28.721 19:15:15 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:28.721 19:15:15 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:08:28.721 19:15:15 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0
00:08:28.721 [2024-04-24 19:15:15.697084] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
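
The xtrace above shows everything start_llvm_fuzz does before fuzzer 0 starts. Collapsed back into plain shell it is roughly the following: a simplified reading of the trace, not the verbatim vfio/run.sh. $spdk_dir stands in for the long Jenkins workspace path, and the output redirections (sed into the config file, the two echoes into the suppression file) are assumptions; the trace shows only the commands.

    # Sketch of the traced per-fuzzer setup (vfio/run.sh@22-47), fuzzer 0.
    i=0
    corpus_dir=$spdk_dir/../corpus/llvm_vfio_$i
    fuzzer_dir=/tmp/vfio-user-$i
    vfiouser_dir=$fuzzer_dir/domain/1     # vfio-user control sockets
    vfiouser_io_dir=$fuzzer_dir/domain/2  # vfio-user I/O sockets
    vfiouser_cfg=$fuzzer_dir/fuzz_vfio_json.conf
    suppress_file=/var/tmp/suppress_vfio_fuzz
    mkdir -p "$fuzzer_dir" "$vfiouser_dir" "$vfiouser_io_dir" "$corpus_dir"
    # Point the template JSON config at this fuzzer's private directories.
    sed -e "s%/tmp/vfio-user/domain/1%$vfiouser_dir%;
            s%/tmp/vfio-user/domain/2%$vfiouser_io_dir%" \
        "$spdk_dir/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$vfiouser_cfg"
    # Suppress known, intentional allocations so LeakSanitizer flags only
    # genuine leaks.
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"
    LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
        "$spdk_dir/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" \
        -m 0x1 -s 0 -t 1 -P "$spdk_dir/../output/llvm/" \
        -c "$vfiouser_cfg" -D "$corpus_dir" \
        -F "$vfiouser_dir" -Y "$vfiouser_io_dir" \
        -r "$fuzzer_dir/spdk$i.sock" -Z "$i"
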
00:08:28.721 [2024-04-24 19:15:15.697180] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629507 ]
00:08:28.980 EAL: No free 2048 kB hugepages reported on node 1
00:08:28.980 [2024-04-24 19:15:15.777826] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:28.980 [2024-04-24 19:15:15.866134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:29.239 INFO: Running with entropic power schedule (0xFF, 100).
00:08:29.239 INFO: Seed: 1015981673
00:08:29.239 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16),
00:08:29.239 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8),
00:08:29.239 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:08:29.239 INFO: A corpus is not provided, starting from an empty corpus
00:08:29.239 #2 INITED exec/s: 0 rss: 66Mb
00:08:29.239 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:29.239 This may also happen if the target rejected all inputs we tried so far
00:08:29.239 [2024-04-24 19:15:16.118550] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller
00:08:29.808 NEW_FUNC[1/633]: 0x481720 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84
00:08:29.808 NEW_FUNC[2/633]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:29.808 #12 NEW cov: 10787 ft: 10469 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 5 ChangeBit-CrossOver-InsertByte-ChangeBit-InsertRepeatedBytes-
00:08:29.808 NEW_FUNC[1/1]: 0x123af30 in nvmf_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/nvmf.c:153
00:08:29.808 #18 NEW cov: 10805 ft: 13205 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ChangeBit-
00:08:29.808 #24 NEW cov: 10805 ft: 14598 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ChangeBit-
00:08:30.067 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:30.067 #30 NEW cov: 10822 ft: 15141 corp: 5/25b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:30.067 #31 NEW cov: 10822 ft: 15354 corp: 6/31b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 1 InsertRepeatedBytes-
00:08:30.325 #32 NEW cov: 10822 ft: 15648 corp: 7/37b lim: 6 exec/s: 32 rss: 74Mb L: 6/6 MS: 1 ChangeBinInt-
00:08:30.325 #33 NEW cov: 10825 ft: 16032 corp: 8/43b lim: 6 exec/s: 33 rss: 74Mb L: 6/6 MS: 1 CopyPart-
00:08:30.585 #43 NEW cov: 10825 ft: 16389 corp: 9/49b lim: 6 exec/s: 43 rss: 74Mb L: 6/6 MS: 5 CrossOver-ChangeByte-CopyPart-ChangeByte-InsertByte-
00:08:30.585 #44 NEW cov: 10825 ft: 16591 corp: 10/55b lim: 6 exec/s: 44 rss: 74Mb L: 6/6 MS: 1 CopyPart-
00:08:30.585 #45 NEW cov: 10825 ft: 16615 corp: 11/61b lim: 6 exec/s: 45 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:30.843 #46 NEW cov: 10825 ft: 16666 corp: 12/67b lim: 6 exec/s: 46 rss: 74Mb L: 6/6 MS: 1 ShuffleBytes-
00:08:30.843 #47 NEW cov: 10825 ft: 16683 corp: 13/73b lim: 6 exec/s: 47 rss: 75Mb L: 6/6 MS: 1 CMP- DE: "\001\011"-
00:08:31.102 #53 NEW cov: 10832 ft: 16757 corp: 14/79b lim: 6 exec/s: 53 rss: 75Mb L: 6/6 MS: 1 CopyPart-
00:08:31.102 #55 NEW cov: 10832 ft: 16778 corp: 15/85b lim: 6 exec/s: 55 rss: 75Mb L: 6/6 MS: 2 EraseBytes-InsertByte-
00:08:31.362 #56 NEW cov: 10832 ft: 16817 corp: 16/91b lim: 6 exec/s: 28 rss: 75Mb L: 6/6 MS: 1 ChangeByte-
00:08:31.362 #56 DONE cov: 10832 ft: 16817 corp: 16/91b lim: 6 exec/s: 28 rss: 75Mb
00:08:31.362 ###### Recommended dictionary. ######
00:08:31.362 "\001\011" # Uses: 0
00:08:31.362 ###### End of recommended dictionary. ######
00:08:31.362 Done 56 runs in 2 second(s)
00:08:31.362 [2024-04-24 19:15:18.189263] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller
00:08:31.621 19:15:18 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz
00:08:31.621 19:15:18 -- ../common.sh@72 -- # (( i++ ))
00:08:31.621 19:15:18 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:31.621 19:15:18 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:08:31.621 19:15:18 -- vfio/run.sh@22 -- # local fuzzer_type=1
00:08:31.621 19:15:18 -- vfio/run.sh@23 -- # local timen=1
00:08:31.621 19:15:18 -- vfio/run.sh@24 -- # local core=0x1
00:08:31.621 19:15:18 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:31.621 19:15:18 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1
00:08:31.621 19:15:18 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1
00:08:31.621 19:15:18 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2
00:08:31.621 19:15:18 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf
00:08:31.621 19:15:18 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz
00:08:31.621 19:15:18 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0
00:08:31.621 19:15:18 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:08:31.621 19:15:18 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%;
00:08:31.621 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:31.621 19:15:18 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect
00:08:31.621 19:15:18 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create
00:08:31.621 19:15:18 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1
00:08:31.621 [2024-04-24 19:15:18.514273] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization...
00:08:31.621 [2024-04-24 19:15:18.514350] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629867 ]
00:08:31.621 EAL: No free 2048 kB hugepages reported on node 1
00:08:31.621 [2024-04-24 19:15:18.593965] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:31.879 [2024-04-24 19:15:18.680343] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:31.880 INFO: Running with entropic power schedule (0xFF, 100).
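
Fuzzer 0 has finished ("Done 56 runs in 2 second(s)") and the driver has moved on to target 1. The loop scheduling these one-second runs, reconstructed from the ../common.sh@69-73 trace, looks roughly like the sketch below; this is a simplified reading of the trace, not the verbatim common.sh.

    # Short-mode driver: run each fuzz target for ~1 second on core 0x1.
    # fuzz_num=7 came from `grep -c '\.fn =' llvm_vfio_fuzz.c` (run.sh@68).
    start_llvm_fuzz_short() {
        local fuzz_num=$1
        local time=$2
        local i
        for (( i = 0; i < fuzz_num; i++ )); do
            # start_llvm_fuzz builds /tmp/vfio-user-$i, runs target $i for
            # $time second(s), then removes the workspace and the LSAN
            # suppression file (vfio/run.sh@58) before the next pass.
            start_llvm_fuzz "$i" "$time" 0x1
        done
    }
    start_llvm_fuzz_short 7 1   # as invoked at vfio/run.sh@76
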
00:08:31.880 INFO: Seed: 3829985580 00:08:32.143 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:32.143 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:32.143 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:32.143 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.143 #2 INITED exec/s: 0 rss: 66Mb 00:08:32.143 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.143 This may also happen if the target rejected all inputs we tried so far 00:08:32.143 [2024-04-24 19:15:18.932080] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:32.143 [2024-04-24 19:15:18.980081] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.143 [2024-04-24 19:15:18.980109] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.143 [2024-04-24 19:15:18.980142] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.444 NEW_FUNC[1/636]: 0x481cc0 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:32.444 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:32.444 #22 NEW cov: 10782 ft: 10598 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 5 ChangeByte-ChangeBit-CrossOver-InsertByte-InsertByte- 00:08:32.444 [2024-04-24 19:15:19.449077] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.444 [2024-04-24 19:15:19.449119] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.444 [2024-04-24 19:15:19.449138] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.710 #23 NEW cov: 10801 ft: 14364 corp: 3/9b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:32.710 [2024-04-24 19:15:19.622243] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.710 [2024-04-24 19:15:19.622271] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.710 [2024-04-24 19:15:19.622290] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.710 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:32.710 #31 NEW cov: 10818 ft: 14667 corp: 4/13b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 3 EraseBytes-InsertByte-CopyPart- 00:08:32.968 [2024-04-24 19:15:19.794857] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.968 [2024-04-24 19:15:19.794883] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.968 [2024-04-24 19:15:19.794901] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.968 #32 NEW cov: 10821 ft: 15639 corp: 5/17b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:32.968 [2024-04-24 19:15:19.967294] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.968 [2024-04-24 19:15:19.967317] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.968 [2024-04-24 19:15:19.967350] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.226 #33 NEW cov: 10821 ft: 15957 corp: 6/21b lim: 4 exec/s: 33 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:33.226 [2024-04-24 19:15:20.147230] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.226 [2024-04-24 19:15:20.147270] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.226 [2024-04-24 19:15:20.147290] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.484 #34 NEW cov: 10821 ft: 16512 corp: 7/25b lim: 4 exec/s: 34 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:33.484 [2024-04-24 19:15:20.329039] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.484 [2024-04-24 19:15:20.329076] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.484 [2024-04-24 19:15:20.329095] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.484 #35 NEW cov: 10821 ft: 16588 corp: 8/29b lim: 4 exec/s: 35 rss: 74Mb L: 4/4 MS: 1 CopyPart- 00:08:33.741 [2024-04-24 19:15:20.500967] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.741 [2024-04-24 19:15:20.500995] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.741 [2024-04-24 19:15:20.501014] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.741 #36 NEW cov: 10821 ft: 16627 corp: 9/33b lim: 4 exec/s: 36 rss: 74Mb L: 4/4 MS: 1 ChangeBit- 00:08:33.741 [2024-04-24 19:15:20.673941] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.741 [2024-04-24 19:15:20.673965] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.741 [2024-04-24 19:15:20.673983] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:34.000 #37 NEW cov: 10828 ft: 16656 corp: 10/37b lim: 4 exec/s: 37 rss: 74Mb L: 4/4 MS: 1 CrossOver- 00:08:34.000 [2024-04-24 19:15:20.846580] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:34.000 [2024-04-24 19:15:20.846605] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:34.000 [2024-04-24 19:15:20.846639] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:34.000 #38 NEW cov: 10828 ft: 16701 corp: 11/41b lim: 4 exec/s: 19 rss: 74Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:34.000 #38 DONE cov: 10828 ft: 16701 corp: 11/41b lim: 4 exec/s: 19 rss: 74Mb 00:08:34.000 Done 38 runs in 2 second(s) 00:08:34.000 [2024-04-24 19:15:20.974283] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:34.259 19:15:21 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:34.259 19:15:21 -- ../common.sh@72 -- # (( i++ )) 00:08:34.259 19:15:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.259 19:15:21 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:34.259 19:15:21 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:34.259 19:15:21 -- vfio/run.sh@23 -- # local timen=1 00:08:34.259 19:15:21 -- vfio/run.sh@24 -- # local core=0x1 00:08:34.259 19:15:21 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:34.259 19:15:21 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 
00:08:34.259 19:15:21 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:34.259 19:15:21 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:34.259 19:15:21 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:34.259 19:15:21 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:34.259 19:15:21 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:34.259 19:15:21 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:34.259 19:15:21 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:34.259 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:34.518 19:15:21 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.518 19:15:21 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:34.518 19:15:21 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:34.518 [2024-04-24 19:15:21.303359] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:34.518 [2024-04-24 19:15:21.303446] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630233 ] 00:08:34.518 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.518 [2024-04-24 19:15:21.382879] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.518 [2024-04-24 19:15:21.468138] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.777 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.777 INFO: Seed: 2323013083 00:08:34.777 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:34.777 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:34.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:34.777 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.777 #2 INITED exec/s: 0 rss: 66Mb 00:08:34.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:34.777 This may also happen if the target rejected all inputs we tried so far 00:08:34.777 [2024-04-24 19:15:21.720103] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:34.777 [2024-04-24 19:15:21.754076] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.293 NEW_FUNC[1/635]: 0x4826a0 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:35.293 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:35.293 #23 NEW cov: 10773 ft: 10732 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:35.293 [2024-04-24 19:15:22.239608] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.551 #29 NEW cov: 10787 ft: 13436 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 ChangeBit- 00:08:35.551 [2024-04-24 19:15:22.425258] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.551 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:35.551 #30 NEW cov: 10804 ft: 13709 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:35.808 [2024-04-24 19:15:22.612855] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.808 #40 NEW cov: 10804 ft: 15262 corp: 5/33b lim: 8 exec/s: 40 rss: 74Mb L: 8/8 MS: 5 InsertByte-ChangeBinInt-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:35.808 [2024-04-24 19:15:22.811193] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.066 #44 NEW cov: 10804 ft: 15875 corp: 6/41b lim: 8 exec/s: 44 rss: 74Mb L: 8/8 MS: 4 ChangeBit-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:08:36.066 [2024-04-24 19:15:22.994170] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.324 #45 NEW cov: 10804 ft: 16391 corp: 7/49b lim: 8 exec/s: 45 rss: 74Mb L: 8/8 MS: 1 ChangeBit- 00:08:36.324 [2024-04-24 19:15:23.179561] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.324 #46 NEW cov: 10804 ft: 16745 corp: 8/57b lim: 8 exec/s: 46 rss: 74Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:36.582 [2024-04-24 19:15:23.369068] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.582 #47 NEW cov: 10811 ft: 16778 corp: 9/65b lim: 8 exec/s: 47 rss: 74Mb L: 8/8 MS: 1 CopyPart- 00:08:36.582 [2024-04-24 19:15:23.552512] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.839 #48 NEW cov: 10811 ft: 16932 corp: 10/73b lim: 8 exec/s: 48 rss: 74Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:36.839 [2024-04-24 19:15:23.736955] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.839 #64 pulse cov: 10811 ft: 17105 corp: 10/73b lim: 8 exec/s: 32 rss: 74Mb 00:08:36.839 #64 NEW cov: 10811 ft: 17105 corp: 11/81b lim: 8 exec/s: 32 rss: 74Mb L: 8/8 MS: 1 ChangeByte- 00:08:36.839 #64 DONE cov: 10811 ft: 17105 corp: 11/81b lim: 8 exec/s: 32 rss: 74Mb 00:08:36.839 Done 64 runs in 2 second(s) 00:08:37.097 [2024-04-24 19:15:23.868276] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:37.356 19:15:24 -- 
vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:37.356 19:15:24 -- ../common.sh@72 -- # (( i++ )) 00:08:37.356 19:15:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.356 19:15:24 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:37.356 19:15:24 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:37.356 19:15:24 -- vfio/run.sh@23 -- # local timen=1 00:08:37.356 19:15:24 -- vfio/run.sh@24 -- # local core=0x1 00:08:37.356 19:15:24 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:37.356 19:15:24 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:37.356 19:15:24 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:37.356 19:15:24 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:37.356 19:15:24 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:37.356 19:15:24 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:37.356 19:15:24 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:37.356 19:15:24 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:37.356 19:15:24 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:37.356 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:37.356 19:15:24 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:37.356 19:15:24 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:37.356 19:15:24 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:37.356 [2024-04-24 19:15:24.186095] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:37.356 [2024-04-24 19:15:24.186176] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630596 ] 00:08:37.356 EAL: No free 2048 kB hugepages reported on node 1 00:08:37.356 [2024-04-24 19:15:24.265080] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.356 [2024-04-24 19:15:24.349631] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.615 INFO: Running with entropic power schedule (0xFF, 100). 00:08:37.615 INFO: Seed: 908045517 00:08:37.615 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:37.615 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:37.615 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:37.615 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.615 #2 INITED exec/s: 0 rss: 66Mb 00:08:37.615 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:37.615 This may also happen if the target rejected all inputs we tried so far 00:08:37.615 [2024-04-24 19:15:24.610685] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:38.132 NEW_FUNC[1/635]: 0x482d80 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:38.132 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:38.132 #101 NEW cov: 10778 ft: 10528 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 ChangeBinInt-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:38.390 #107 NEW cov: 10795 ft: 13885 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeByte- 00:08:38.648 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:38.648 #108 NEW cov: 10812 ft: 14412 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:38.648 #114 NEW cov: 10812 ft: 14520 corp: 5/129b lim: 32 exec/s: 114 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:38.906 #120 NEW cov: 10812 ft: 14642 corp: 6/161b lim: 32 exec/s: 120 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:39.163 #121 NEW cov: 10812 ft: 15068 corp: 7/193b lim: 32 exec/s: 121 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:39.163 #122 NEW cov: 10812 ft: 15273 corp: 8/225b lim: 32 exec/s: 122 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:39.421 #123 NEW cov: 10812 ft: 15314 corp: 9/257b lim: 32 exec/s: 123 rss: 74Mb L: 32/32 MS: 1 CMP- DE: "\000\003"- 00:08:39.678 #124 NEW cov: 10819 ft: 15403 corp: 10/289b lim: 32 exec/s: 124 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:39.678 #125 NEW cov: 10819 ft: 16444 corp: 11/321b lim: 32 exec/s: 62 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:39.678 #125 DONE cov: 10819 ft: 16444 corp: 11/321b lim: 32 exec/s: 62 rss: 74Mb 00:08:39.678 ###### Recommended dictionary. ###### 00:08:39.678 "\000\003" # Uses: 0 00:08:39.678 ###### End of recommended dictionary. 
###### 00:08:39.678 Done 125 runs in 2 second(s) 00:08:39.678 [2024-04-24 19:15:26.674278] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:40.244 19:15:26 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:40.244 19:15:26 -- ../common.sh@72 -- # (( i++ )) 00:08:40.244 19:15:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.244 19:15:26 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:40.244 19:15:26 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:40.244 19:15:26 -- vfio/run.sh@23 -- # local timen=1 00:08:40.244 19:15:26 -- vfio/run.sh@24 -- # local core=0x1 00:08:40.244 19:15:26 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:40.244 19:15:26 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:40.244 19:15:26 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:40.244 19:15:26 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:40.244 19:15:26 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:40.244 19:15:26 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:40.244 19:15:26 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:40.244 19:15:26 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:40.244 19:15:26 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:40.244 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:40.244 19:15:26 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:40.244 19:15:26 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:40.244 19:15:26 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:40.244 [2024-04-24 19:15:27.005963] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:40.244 [2024-04-24 19:15:27.006040] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630955 ] 00:08:40.244 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.244 [2024-04-24 19:15:27.086388] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.244 [2024-04-24 19:15:27.171093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.503 INFO: Running with entropic power schedule (0xFF, 100). 
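With the /tmp/vfio-user-4 tree, sed-generated config and suppression file in place, run.sh@47 launches the actual fuzz target; the full command is traced above. Rebuilt below as a commented argument list for readability; the flag-to-local mapping is inferred from the run.sh locals traced earlier, and the glosses for -m, -s and -r follow the generic SPDK application options, so treat those three as assumptions rather than log content:

    # Sketch of the traced llvm_vfio_fuzz invocation; $spdk_dir, $out_dir and
    # $corpus_dir are hypothetical stand-ins for the long Jenkins paths.
    args=(
        -m 0x1                                    # core mask (local core=0x1)
        -s 0                                      # assumed: DPDK memory size in MB
        -P "$out_dir/llvm/"                       # assumed: crash/artifact output prefix
        -F /tmp/vfio-user-4/domain/1              # vfiouser_dir
        -c /tmp/vfio-user-4/fuzz_vfio_json.conf   # vfiouser_cfg, written by the sed above
        -t 1                                      # timen (per-fuzzer time budget)
        -D "$corpus_dir/llvm_vfio_4"              # corpus_dir
        -Y /tmp/vfio-user-4/domain/2              # vfiouser_io_dir
        -r /tmp/vfio-user-4/spdk4.sock            # assumed: SPDK RPC socket path
        -Z 4                                      # fuzzer_type
    )
    "$spdk_dir/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" "${args[@]}"

The EAL lines that follow are consistent with the single-core -m 0x1 mask: "Total cores available: 1" and one reactor started on core 0.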
00:08:40.503 INFO: Seed: 3722090252 00:08:40.503 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:40.503 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:40.503 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:40.503 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.503 #2 INITED exec/s: 0 rss: 66Mb 00:08:40.503 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.503 This may also happen if the target rejected all inputs we tried so far 00:08:40.503 [2024-04-24 19:15:27.412904] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:41.018 NEW_FUNC[1/635]: 0x483600 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:41.018 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:41.018 #12 NEW cov: 10780 ft: 10748 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 5 ChangeByte-CrossOver-InsertRepeatedBytes-ChangeByte-InsertRepeatedBytes- 00:08:41.276 #13 NEW cov: 10794 ft: 13295 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:41.276 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.276 #14 NEW cov: 10814 ft: 14554 corp: 4/97b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBit- 00:08:41.534 #25 NEW cov: 10814 ft: 15021 corp: 5/129b lim: 32 exec/s: 25 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:41.791 #26 NEW cov: 10814 ft: 15048 corp: 6/161b lim: 32 exec/s: 26 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:41.791 #27 NEW cov: 10814 ft: 16121 corp: 7/193b lim: 32 exec/s: 27 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:42.048 #28 NEW cov: 10814 ft: 16222 corp: 8/225b lim: 32 exec/s: 28 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:42.306 #29 NEW cov: 10814 ft: 17027 corp: 9/257b lim: 32 exec/s: 29 rss: 74Mb L: 32/32 MS: 1 CrossOver- 00:08:42.564 #30 NEW cov: 10821 ft: 17089 corp: 10/289b lim: 32 exec/s: 30 rss: 74Mb L: 32/32 MS: 1 CMP- DE: ":\000"- 00:08:42.564 #31 NEW cov: 10821 ft: 17441 corp: 11/321b lim: 32 exec/s: 15 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:42.564 #31 DONE cov: 10821 ft: 17441 corp: 11/321b lim: 32 exec/s: 15 rss: 74Mb 00:08:42.564 ###### Recommended dictionary. ###### 00:08:42.564 ":\000" # Uses: 0 00:08:42.564 ###### End of recommended dictionary. 
###### 00:08:42.564 Done 31 runs in 2 second(s) 00:08:42.564 [2024-04-24 19:15:29.558279] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:42.822 19:15:29 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:42.822 19:15:29 -- ../common.sh@72 -- # (( i++ )) 00:08:42.822 19:15:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.822 19:15:29 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:42.822 19:15:29 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:42.822 19:15:29 -- vfio/run.sh@23 -- # local timen=1 00:08:42.822 19:15:29 -- vfio/run.sh@24 -- # local core=0x1 00:08:42.822 19:15:29 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:42.822 19:15:29 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:42.822 19:15:29 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:42.822 19:15:29 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:42.822 19:15:29 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:42.822 19:15:29 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:42.822 19:15:29 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:42.822 19:15:29 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:43.080 19:15:29 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:43.080 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.080 19:15:29 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:43.080 19:15:29 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:43.080 19:15:29 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:43.080 [2024-04-24 19:15:29.871358] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:43.080 [2024-04-24 19:15:29.871429] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631361 ] 00:08:43.080 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.080 [2024-04-24 19:15:29.949786] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.080 [2024-04-24 19:15:30.042395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.339 INFO: Running with entropic power schedule (0xFF, 100). 
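Each instance also rebuilds its leak-sanitizer suppression file before launch: run.sh@43-44 emit two "leak:" patterns and run.sh@34 points LSAN_OPTIONS at the suppress_file local, so leak reports whose stacks contain spdk_nvmf_qpair_disconnect or nvmf_ctrlr_create are suppressed instead of failing the run. A minimal sketch, assuming the traced echo lines redirect into that file; bash xtrace does not print redirections, so the ">" targets here are inferred from the suppress_file local rather than read from the log:

    # Assumed shape of run.sh@34/@43/@44; the trace declares LSAN_OPTIONS as a
    # function-local variable, the export here is only to keep the sketch
    # self-contained.
    suppress_file=/var/tmp/suppress_vfio_fuzz
    echo "leak:spdk_nvmf_qpair_disconnect"  > "$suppress_file"
    echo "leak:nvmf_ctrlr_create"          >> "$suppress_file"
    export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"

report_objects=1 makes LSAN list the leaked allocations it still finds, and print_suppressions=0 keeps the suppression summary out of the log.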
00:08:43.339 INFO: Seed: 2303113183 00:08:43.339 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:43.339 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:43.339 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:43.339 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.339 #2 INITED exec/s: 0 rss: 66Mb 00:08:43.339 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.339 This may also happen if the target rejected all inputs we tried so far 00:08:43.339 [2024-04-24 19:15:30.299984] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:43.597 [2024-04-24 19:15:30.374800] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.597 [2024-04-24 19:15:30.374843] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.855 NEW_FUNC[1/636]: 0x484000 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:43.855 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:43.855 #27 NEW cov: 10789 ft: 10371 corp: 2/14b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 5 ChangeBit-ChangeBit-InsertRepeatedBytes-CopyPart-InsertByte- 00:08:44.113 [2024-04-24 19:15:30.880207] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.113 [2024-04-24 19:15:30.880249] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.113 #29 NEW cov: 10806 ft: 13154 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 2 InsertRepeatedBytes-CMP- DE: "\021\000"- 00:08:44.113 [2024-04-24 19:15:31.083292] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.113 [2024-04-24 19:15:31.083325] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.372 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.372 #30 NEW cov: 10823 ft: 14425 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 CopyPart- 00:08:44.372 [2024-04-24 19:15:31.274427] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.372 [2024-04-24 19:15:31.274459] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.630 #31 NEW cov: 10823 ft: 16019 corp: 5/53b lim: 13 exec/s: 31 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:44.630 [2024-04-24 19:15:31.464651] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.630 [2024-04-24 19:15:31.464685] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.630 #32 NEW cov: 10823 ft: 16339 corp: 6/66b lim: 13 exec/s: 32 rss: 74Mb L: 13/13 MS: 1 ShuffleBytes- 00:08:44.888 [2024-04-24 19:15:31.653839] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.888 [2024-04-24 19:15:31.653871] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.888 #33 NEW cov: 10823 ft: 16503 corp: 7/79b lim: 13 exec/s: 33 rss: 74Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:44.888 [2024-04-24 19:15:31.843751] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.888 [2024-04-24 19:15:31.843781] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:45.146 #34 NEW cov: 10823 ft: 16646 corp: 8/92b lim: 13 exec/s: 34 rss: 74Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:45.146 [2024-04-24 19:15:32.032609] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:45.146 [2024-04-24 19:15:32.032640] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:45.146 #35 NEW cov: 10830 ft: 16699 corp: 9/105b lim: 13 exec/s: 35 rss: 74Mb L: 13/13 MS: 1 ChangeByte- 00:08:45.405 [2024-04-24 19:15:32.221640] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:45.405 [2024-04-24 19:15:32.221672] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:45.405 #37 NEW cov: 10830 ft: 17011 corp: 10/118b lim: 13 exec/s: 18 rss: 74Mb L: 13/13 MS: 2 CrossOver-CrossOver- 00:08:45.405 #37 DONE cov: 10830 ft: 17011 corp: 10/118b lim: 13 exec/s: 18 rss: 74Mb 00:08:45.405 ###### Recommended dictionary. ###### 00:08:45.405 "\021\000" # Uses: 0 00:08:45.405 ###### End of recommended dictionary. ###### 00:08:45.405 Done 37 runs in 2 second(s) 00:08:45.405 [2024-04-24 19:15:32.352282] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:45.664 19:15:32 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:45.664 19:15:32 -- ../common.sh@72 -- # (( i++ )) 00:08:45.664 19:15:32 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:45.664 19:15:32 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:45.664 19:15:32 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:45.664 19:15:32 -- vfio/run.sh@23 -- # local timen=1 00:08:45.664 19:15:32 -- vfio/run.sh@24 -- # local core=0x1 00:08:45.664 19:15:32 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:45.664 19:15:32 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:45.664 19:15:32 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:45.664 19:15:32 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:45.664 19:15:32 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:45.664 19:15:32 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:45.664 19:15:32 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:45.664 19:15:32 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:45.664 19:15:32 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:45.664 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:45.664 19:15:32 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:45.664 19:15:32 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:45.664 19:15:32 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c 
/tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:45.664 [2024-04-24 19:15:32.653564] Starting SPDK v24.05-pre git sha1 5c8d451f1 / DPDK 23.11.0 initialization... 00:08:45.664 [2024-04-24 19:15:32.653632] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631782 ] 00:08:45.923 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.923 [2024-04-24 19:15:32.734480] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.923 [2024-04-24 19:15:32.816828] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.181 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.181 INFO: Seed: 786110200 00:08:46.181 INFO: Loaded 1 modules (345802 inline 8-bit counters): 345802 [0x287774c, 0x28cbe16), 00:08:46.181 INFO: Loaded 1 PC tables (345802 PCs): 345802 [0x28cbe18,0x2e12ab8), 00:08:46.181 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:46.181 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.181 #2 INITED exec/s: 0 rss: 66Mb 00:08:46.181 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:46.181 This may also happen if the target rejected all inputs we tried so far 00:08:46.181 [2024-04-24 19:15:33.071267] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:46.181 [2024-04-24 19:15:33.115107] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:46.181 [2024-04-24 19:15:33.115172] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:46.700 NEW_FUNC[1/636]: 0x484cf0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:46.700 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:46.700 #17 NEW cov: 10780 ft: 10632 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 ChangeByte-InsertByte-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:46.700 [2024-04-24 19:15:33.604403] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:46.700 [2024-04-24 19:15:33.604452] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:46.700 #18 NEW cov: 10798 ft: 13564 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:46.959 [2024-04-24 19:15:33.779740] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:46.959 [2024-04-24 19:15:33.779778] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:46.959 NEW_FUNC[1/1]: 0x198efb0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.959 #19 NEW cov: 10815 ft: 15350 corp: 4/28b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:46.959 [2024-04-24 19:15:33.968574] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:46.959 [2024-04-24 19:15:33.968607] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.218 #20 
NEW cov: 10815 ft: 16339 corp: 5/37b lim: 9 exec/s: 20 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:08:47.218 [2024-04-24 19:15:34.144069] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.218 [2024-04-24 19:15:34.144104] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.476 #21 NEW cov: 10815 ft: 16457 corp: 6/46b lim: 9 exec/s: 21 rss: 74Mb L: 9/9 MS: 1 ChangeBit- 00:08:47.476 [2024-04-24 19:15:34.318815] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.476 [2024-04-24 19:15:34.318847] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.476 #32 NEW cov: 10815 ft: 16843 corp: 7/55b lim: 9 exec/s: 32 rss: 74Mb L: 9/9 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:47.735 [2024-04-24 19:15:34.493353] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.735 [2024-04-24 19:15:34.493385] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.735 #33 NEW cov: 10815 ft: 17107 corp: 8/64b lim: 9 exec/s: 33 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:08:47.735 [2024-04-24 19:15:34.669578] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.735 [2024-04-24 19:15:34.669611] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.993 #34 NEW cov: 10815 ft: 17247 corp: 9/73b lim: 9 exec/s: 34 rss: 74Mb L: 9/9 MS: 1 ChangeByte- 00:08:47.993 [2024-04-24 19:15:34.844844] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:47.993 [2024-04-24 19:15:34.844877] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:47.993 #35 NEW cov: 10822 ft: 17731 corp: 10/82b lim: 9 exec/s: 35 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:48.252 [2024-04-24 19:15:35.033277] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:48.252 [2024-04-24 19:15:35.033309] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:48.252 #41 NEW cov: 10822 ft: 18238 corp: 11/91b lim: 9 exec/s: 20 rss: 74Mb L: 9/9 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:48.252 #41 DONE cov: 10822 ft: 18238 corp: 11/91b lim: 9 exec/s: 20 rss: 74Mb 00:08:48.252 ###### Recommended dictionary. ###### 00:08:48.252 "\001\000\000\000" # Uses: 1 00:08:48.252 ###### End of recommended dictionary. 
###### 00:08:48.252 Done 41 runs in 2 second(s) 00:08:48.252 [2024-04-24 19:15:35.160263] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:08:48.512 19:15:35 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:08:48.512 19:15:35 -- ../common.sh@72 -- # (( i++ )) 00:08:48.512 19:15:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.512 19:15:35 -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:08:48.512 00:08:48.512 real 0m20.059s 00:08:48.512 user 0m27.515s 00:08:48.512 sys 0m2.021s 00:08:48.512 19:15:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:48.512 19:15:35 -- common/autotest_common.sh@10 -- # set +x 00:08:48.512 ************************************ 00:08:48.512 END TEST vfio_fuzz 00:08:48.512 ************************************ 00:08:48.512 19:15:35 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:08:48.512 00:08:48.512 real 1m26.850s 00:08:48.512 user 2m8.232s 00:08:48.512 sys 0m11.130s 00:08:48.512 19:15:35 -- common/autotest_common.sh@1112 -- # xtrace_disable 00:08:48.512 19:15:35 -- common/autotest_common.sh@10 -- # set +x 00:08:48.512 ************************************ 00:08:48.512 END TEST llvm_fuzz 00:08:48.512 ************************************ 00:08:48.512 19:15:35 -- spdk/autotest.sh@373 -- # [[ 0 -eq 1 ]] 00:08:48.512 19:15:35 -- spdk/autotest.sh@378 -- # trap - SIGINT SIGTERM EXIT 00:08:48.512 19:15:35 -- spdk/autotest.sh@380 -- # timing_enter post_cleanup 00:08:48.512 19:15:35 -- common/autotest_common.sh@710 -- # xtrace_disable 00:08:48.512 19:15:35 -- common/autotest_common.sh@10 -- # set +x 00:08:48.512 19:15:35 -- spdk/autotest.sh@381 -- # autotest_cleanup 00:08:48.512 19:15:35 -- common/autotest_common.sh@1378 -- # local autotest_es=0 00:08:48.512 19:15:35 -- common/autotest_common.sh@1379 -- # xtrace_disable 00:08:48.512 19:15:35 -- common/autotest_common.sh@10 -- # set +x 00:08:52.702 INFO: APP EXITING 00:08:52.702 INFO: killing all VMs 00:08:52.702 INFO: killing vhost app 00:08:52.702 INFO: EXIT DONE 00:08:55.986 Waiting for block devices as requested 00:08:55.986 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:08:55.986 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:55.986 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:55.986 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:55.986 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:55.986 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:55.986 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:56.244 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:56.244 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:56.244 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:56.503 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:56.503 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:56.503 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:56.761 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:56.761 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:56.761 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:57.020 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:02.289 Cleaning 00:09:02.289 Removing: /dev/shm/spdk_tgt_trace.pid1602090 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1599044 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1600578 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1602090 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1602665 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1603420 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1603637 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1604524 
00:09:02.289 Removing: /var/run/dpdk/spdk_pid1604574 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1605042 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1605277 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1605529 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1605845 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1606198 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1606404 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1606611 00:09:02.289 Removing: /var/run/dpdk/spdk_pid1606842 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1607612 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1610123 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1610350 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1610706 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1610746 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1611265 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1611317 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1611706 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1611880 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1612099 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1612277 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1612486 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1612509 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1612966 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1613170 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1613380 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1613614 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1613900 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1614037 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1614287 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1614493 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1614693 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1614907 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1615108 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1615366 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1615681 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1615882 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1616085 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1616288 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1616495 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1616758 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1617056 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1617262 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1617470 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1617672 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1617878 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1618099 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1618413 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1618650 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1618858 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1618992 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1619366 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1619943 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1620302 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1620658 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1621017 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1621367 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1621707 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1621996 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1622301 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1622644 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1623003 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1623359 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1623718 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1624072 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1624431 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1624793 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1625150 
00:09:02.290 Removing: /var/run/dpdk/spdk_pid1625513 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1625872 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1626233 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1626592 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1627072 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1627629 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1628251 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1628602 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1628970 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1629507 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1629867 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1630233 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1630596 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1630955 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1631361 00:09:02.290 Removing: /var/run/dpdk/spdk_pid1631782 00:09:02.290 Clean 00:09:02.290 19:15:49 -- common/autotest_common.sh@1437 -- # return 0 00:09:02.290 19:15:49 -- spdk/autotest.sh@382 -- # timing_exit post_cleanup 00:09:02.290 19:15:49 -- common/autotest_common.sh@716 -- # xtrace_disable 00:09:02.290 19:15:49 -- common/autotest_common.sh@10 -- # set +x 00:09:02.290 19:15:49 -- spdk/autotest.sh@384 -- # timing_exit autotest 00:09:02.290 19:15:49 -- common/autotest_common.sh@716 -- # xtrace_disable 00:09:02.290 19:15:49 -- common/autotest_common.sh@10 -- # set +x 00:09:02.290 19:15:49 -- spdk/autotest.sh@385 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:02.290 19:15:49 -- spdk/autotest.sh@387 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:02.290 19:15:49 -- spdk/autotest.sh@387 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:02.290 19:15:49 -- spdk/autotest.sh@389 -- # hash lcov 00:09:02.290 19:15:49 -- spdk/autotest.sh@389 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:09:02.549 19:15:49 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:02.549 19:15:49 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:02.549 19:15:49 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:02.549 19:15:49 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:02.549 19:15:49 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:02.549 19:15:49 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:02.549 19:15:49 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:02.549 19:15:49 -- paths/export.sh@5 -- $ export PATH 00:09:02.549 19:15:49 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:02.549 19:15:49 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:02.549 19:15:49 -- common/autobuild_common.sh@435 -- $ date +%s 00:09:02.549 19:15:49 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1713978949.XXXXXX 00:09:02.549 19:15:49 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1713978949.4W5IrT 00:09:02.549 19:15:49 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:09:02.549 19:15:49 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']' 00:09:02.549 19:15:49 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:09:02.549 19:15:49 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:02.549 19:15:49 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:02.549 19:15:49 -- common/autobuild_common.sh@451 -- $ get_config_params 00:09:02.549 19:15:49 -- common/autotest_common.sh@385 -- $ xtrace_disable 00:09:02.549 19:15:49 -- common/autotest_common.sh@10 -- $ set +x 00:09:02.549 19:15:49 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:02.549 19:15:49 -- common/autobuild_common.sh@453 -- $ start_monitor_resources 00:09:02.549 19:15:49 -- pm/common@17 -- $ local monitor 00:09:02.549 19:15:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:02.549 19:15:49 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1637475 00:09:02.549 19:15:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:02.549 19:15:49 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1637477 00:09:02.549 19:15:49 -- pm/common@21 -- $ date +%s 00:09:02.549 19:15:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:02.549 19:15:49 -- pm/common@21 -- $ date +%s 00:09:02.549 19:15:49 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1637480 00:09:02.549 19:15:49 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:02.549 19:15:49 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1637485 00:09:02.549 19:15:49 -- pm/common@21 -- $ sudo -E 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713978949 00:09:02.549 19:15:49 -- pm/common@21 -- $ date +%s 00:09:02.549 19:15:49 -- pm/common@26 -- $ sleep 1 00:09:02.549 19:15:49 -- pm/common@21 -- $ date +%s 00:09:02.549 19:15:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713978949 00:09:02.549 19:15:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713978949 00:09:02.549 19:15:49 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1713978949 00:09:02.549 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713978949_collect-cpu-load.pm.log 00:09:02.549 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713978949_collect-vmstat.pm.log 00:09:02.549 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713978949_collect-bmc-pm.bmc.pm.log 00:09:02.549 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1713978949_collect-cpu-temp.pm.log 00:09:03.486 19:15:50 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT 00:09:03.486 19:15:50 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:09:03.486 19:15:50 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:03.486 19:15:50 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:03.486 19:15:50 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:03.486 19:15:50 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:03.486 19:15:50 -- common/autotest_common.sh@722 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:03.486 19:15:50 -- common/autotest_common.sh@723 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:03.486 19:15:50 -- common/autotest_common.sh@725 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:03.486 19:15:50 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:03.486 19:15:50 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:09:03.486 19:15:50 -- pm/common@30 -- $ signal_monitor_resources TERM 00:09:03.486 19:15:50 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:09:03.486 19:15:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:03.486 19:15:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:09:03.486 19:15:50 -- pm/common@45 -- $ pid=1637498 00:09:03.486 19:15:50 -- pm/common@52 -- $ sudo kill -TERM 1637498 00:09:03.486 19:15:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:03.486 19:15:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:09:03.486 19:15:50 -- pm/common@45 -- $ pid=1637507 00:09:03.486 19:15:50 -- 
pm/common@52 -- $ sudo kill -TERM 1637507 00:09:03.745 19:15:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:03.745 19:15:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:09:03.745 19:15:50 -- pm/common@45 -- $ pid=1637514 00:09:03.745 19:15:50 -- pm/common@52 -- $ sudo kill -TERM 1637514 00:09:03.745 19:15:50 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:09:03.745 19:15:50 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:09:03.745 19:15:50 -- pm/common@45 -- $ pid=1637513 00:09:03.745 19:15:50 -- pm/common@52 -- $ sudo kill -TERM 1637513 00:09:03.745 + [[ -n 1495660 ]] 00:09:03.745 + sudo kill 1495660 00:09:03.756 [Pipeline] } 00:09:03.774 [Pipeline] // stage 00:09:03.780 [Pipeline] } 00:09:03.800 [Pipeline] // timeout 00:09:03.806 [Pipeline] } 00:09:03.823 [Pipeline] // catchError 00:09:03.828 [Pipeline] } 00:09:03.845 [Pipeline] // wrap 00:09:03.852 [Pipeline] } 00:09:03.866 [Pipeline] // catchError 00:09:03.877 [Pipeline] stage 00:09:03.879 [Pipeline] { (Epilogue) 00:09:03.895 [Pipeline] catchError 00:09:03.897 [Pipeline] { 00:09:03.912 [Pipeline] echo 00:09:03.914 Cleanup processes 00:09:03.920 [Pipeline] sh 00:09:04.199 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:04.199 1552837 sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:09:04.199 1552876 bash /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1713978541 00:09:04.199 1637650 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache 00:09:04.199 1638421 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:04.212 [Pipeline] sh 00:09:04.493 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:04.493 ++ grep -v 'sudo pgrep' 00:09:04.493 ++ awk '{print $1}' 00:09:04.493 + sudo kill -9 1637650 00:09:04.507 [Pipeline] sh 00:09:04.832 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:05.819 [Pipeline] sh 00:09:06.101 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:06.101 Artifacts sizes are good 00:09:06.118 [Pipeline] archiveArtifacts 00:09:06.125 Archiving artifacts 00:09:06.188 [Pipeline] sh 00:09:06.469 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:06.485 [Pipeline] cleanWs 00:09:06.494 [WS-CLEANUP] Deleting project workspace... 00:09:06.494 [WS-CLEANUP] Deferred wipeout is used... 00:09:06.500 [WS-CLEANUP] done 00:09:06.502 [Pipeline] } 00:09:06.528 [Pipeline] // catchError 00:09:06.564 [Pipeline] sh 00:09:06.848 + logger -p user.info -t JENKINS-CI 00:09:06.857 [Pipeline] } 00:09:06.872 [Pipeline] // stage 00:09:06.877 [Pipeline] } 00:09:06.893 [Pipeline] // node 00:09:06.899 [Pipeline] End of Pipeline 00:09:06.934 Finished: SUCCESS
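The pm/common@41-52 traces in the epilogue above show signal_monitor_resources stopping the four resource monitors autopackage started at the top of the Clean section (collect-cpu-load, collect-vmstat, collect-bmc-pm, collect-cpu-temp, each redirected to a monitor.autopackage.sh.*.pm.log file): for each monitor it tests for a PID file under the output power directory and TERMs the recorded process. A minimal sketch of that teardown; the for statement, the [[ -e ... ]] test, the pid=... assignment and the sudo kill -TERM are confirmed by the trace, while the PID-file read and the array contents are assumptions:

    # Assumed shape of signal_monitor_resources in scripts/perf/pm/common;
    # $power_dir is a hypothetical stand-in for the Jenkins .../output/power path.
    MONITOR_RESOURCES=(collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm)
    signal=TERM                                    # traced at pm/common@41
    for monitor in "${MONITOR_RESOURCES[@]}"; do   # traced at pm/common@43
        pidfile="$power_dir/$monitor.pid"
        [[ -e $pidfile ]] || continue              # traced at pm/common@44
        pid=$(<"$pidfile")                         # assumed read; trace shows pid=1637498, ...
        sudo kill "-$signal" "$pid"                # traced at pm/common@52
    done

Because each monitor recorded its own PID file when it started, the teardown stays idempotent: a monitor that never started, or already exited and cleaned up its PID file, is simply skipped.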